00:00:00.001 Started by upstream project "autotest-spdk-master-vs-dpdk-main" build number 4092
00:00:00.001 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3682
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.030 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.034 The recommended git tool is: git
00:00:00.034 using credential 00000000-0000-0000-0000-000000000002
00:00:00.038 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.050 Fetching changes from the remote Git repository
00:00:00.054 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.067 Using shallow fetch with depth 1
00:00:00.067 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.067 > git --version # timeout=10
00:00:00.090 > git --version # 'git version 2.39.2'
00:00:00.090 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.113 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.113 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.995 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:03.007 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:03.018 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD)
00:00:03.018 > git config core.sparsecheckout # timeout=10
00:00:03.028 > git read-tree -mu HEAD # timeout=10
00:00:03.044 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5
00:00:03.064 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag"
00:00:03.064 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10
00:00:03.141 [Pipeline] Start of Pipeline
00:00:03.155 [Pipeline] library
00:00:03.158 Loading library shm_lib@master
00:00:03.158 Library shm_lib@master is cached. Copying from home.
00:00:03.174 [Pipeline] node
00:00:03.190 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:03.192 [Pipeline] {
00:00:03.201 [Pipeline] catchError
00:00:03.203 [Pipeline] {
00:00:03.215 [Pipeline] wrap
00:00:03.225 [Pipeline] {
00:00:03.233 [Pipeline] stage
00:00:03.235 [Pipeline] { (Prologue)
00:00:03.446 [Pipeline] sh
00:00:03.730 + logger -p user.info -t JENKINS-CI
00:00:03.754 [Pipeline] echo
00:00:03.756 Node: WFP20
00:00:03.765 [Pipeline] sh
00:00:04.069 [Pipeline] setCustomBuildProperty
00:00:04.082 [Pipeline] echo
00:00:04.083 Cleanup processes
00:00:04.089 [Pipeline] sh
00:00:04.373 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.373 1564599 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.385 [Pipeline] sh
00:00:04.670 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.670 ++ grep -v 'sudo pgrep'
00:00:04.670 ++ awk '{print $1}'
00:00:04.670 + sudo kill -9
00:00:04.670 + true
00:00:04.684 [Pipeline] cleanWs
00:00:04.694 [WS-CLEANUP] Deleting project workspace...
00:00:04.694 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.700 [WS-CLEANUP] done
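The "Cleanup processes" step above chains pgrep, grep -v, and awk to collect PIDs of stale processes from a previous run, then kills them; the trailing `+ true` swallows the nonzero exit that `kill -9` returns when the PID list is empty. A minimal standalone sketch of the same idiom (the workspace path is this job's; adjust for other jobs):

```bash
#!/usr/bin/env bash
# Kill any leftover processes still running out of the job workspace.
# Mirrors the Prologue trace above: pgrep -af lists matching processes
# with their command lines, grep -v drops the pgrep invocation itself,
# and awk keeps only the PID column.
WORKSPACE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
pids=$(sudo pgrep -af "$WORKSPACE" | grep -v 'sudo pgrep' | awk '{print $1}')
# With an empty argument list, kill exits nonzero; '|| true' keeps the
# pipeline step green, exactly like the '+ true' in the log.
sudo kill -9 $pids || true
```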
00:00:04.704 [Pipeline] setCustomBuildProperty
00:00:04.719 [Pipeline] sh
00:00:04.998 + sudo git config --global --replace-all safe.directory '*'
00:00:05.084 [Pipeline] httpRequest
00:00:05.905 [Pipeline] echo
00:00:05.907 Sorcerer 10.211.164.20 is alive
00:00:05.919 [Pipeline] retry
00:00:05.921 [Pipeline] {
00:00:05.937 [Pipeline] httpRequest
00:00:05.941 HttpMethod: GET
00:00:05.941 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.941 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.943 Response Code: HTTP/1.1 200 OK
00:00:05.944 Success: Status code 200 is in the accepted range: 200,404
00:00:05.944 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:06.424 [Pipeline] }
00:00:06.436 [Pipeline] // retry
00:00:06.442 [Pipeline] sh
00:00:06.721 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:06.734 [Pipeline] httpRequest
00:00:07.280 [Pipeline] echo
00:00:07.282 Sorcerer 10.211.164.20 is alive
00:00:07.290 [Pipeline] retry
00:00:07.292 [Pipeline] {
00:00:07.300 [Pipeline] httpRequest
00:00:07.303 HttpMethod: GET
00:00:07.304 URL: http://10.211.164.20/packages/spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz
00:00:07.304 Sending request to url: http://10.211.164.20/packages/spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz
00:00:07.306 Response Code: HTTP/1.1 200 OK
00:00:07.306 Success: Status code 200 is in the accepted range: 200,404
00:00:07.306 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz
00:00:33.558 [Pipeline] }
00:00:33.577 [Pipeline] // retry
00:00:33.586 [Pipeline] sh
00:00:33.870 + tar --no-same-owner -xf spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz
00:00:36.419 [Pipeline] sh
00:00:36.703 + git -C spdk log --oneline -n5
00:00:36.703 35cd3e84d bdev/part: Pass through dif_check_flags via dif_check_flags_exclude_mask
00:00:36.703 01a2c4855 bdev/passthru: Pass through dif_check_flags via dif_check_flags_exclude_mask
00:00:36.703 9094b9600 bdev: Assert to check if I/O pass dif_check_flags not enabled by bdev
00:00:36.703 2e10c84c8 nvmf: Expose DIF type of namespace to host again
00:00:36.703 38b931b23 nvmf: Set bdev_ext_io_opts::dif_check_flags_exclude_mask for read/write
00:00:36.722 [Pipeline] withCredentials
00:00:36.733 > git --version # timeout=10
00:00:36.748 > git --version # 'git version 2.39.2'
00:00:36.765 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:00:36.767 [Pipeline] {
00:00:36.777 [Pipeline] retry
00:00:36.780 [Pipeline] {
00:00:36.795 [Pipeline] sh
00:00:37.079 + git ls-remote http://dpdk.org/git/dpdk main
00:00:37.656 [Pipeline] }
00:00:37.673 [Pipeline] // retry
00:00:37.678 [Pipeline] }
00:00:37.694 [Pipeline] // withCredentials
00:00:37.704 [Pipeline] httpRequest
00:00:38.090 [Pipeline] echo
00:00:38.092 Sorcerer 10.211.164.20 is alive
00:00:38.104 [Pipeline] retry
00:00:38.107 [Pipeline] {
00:00:38.124 [Pipeline] httpRequest
00:00:38.129 HttpMethod: GET
00:00:38.130 URL: http://10.211.164.20/packages/dpdk_8750576fb2a9a067ffbcce4bab6481f3bfa47097.tar.gz
00:00:38.130 Sending request to url: http://10.211.164.20/packages/dpdk_8750576fb2a9a067ffbcce4bab6481f3bfa47097.tar.gz
00:00:38.146 Response Code: HTTP/1.1 200 OK
00:00:38.147 Success: Status code 200 is in the accepted range: 200,404
00:00:38.147 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_8750576fb2a9a067ffbcce4bab6481f3bfa47097.tar.gz
00:01:15.342 [Pipeline] }
00:01:15.359 [Pipeline] // retry
00:01:15.366 [Pipeline] sh
00:01:15.649 + tar --no-same-owner -xf dpdk_8750576fb2a9a067ffbcce4bab6481f3bfa47097.tar.gz
00:01:17.037 [Pipeline] sh
00:01:17.332 + git -C dpdk log --oneline -n5
00:01:17.332 8750576fb2 doc: reword some sample app guides
00:01:17.332 c0f5a9dd74 doc: fix grammar and phrasing in multi-process app guide
00:01:17.332 b456bf5006 usertools/devbind: fix NUMA node display
00:01:17.332 828fe9de4c usertools/devbind: restore active marker
00:01:17.332 497cf54829 dts: remove nested html directory for API doc
00:01:17.342 [Pipeline] }
00:01:17.354 [Pipeline] // stage
00:01:17.363 [Pipeline] stage
00:01:17.365 [Pipeline] { (Prepare)
00:01:17.388 [Pipeline] writeFile
00:01:17.403 [Pipeline] sh
00:01:17.683 + logger -p user.info -t JENKINS-CI
00:01:17.694 [Pipeline] sh
00:01:17.975 + logger -p user.info -t JENKINS-CI
00:01:17.988 [Pipeline] sh
00:01:18.271 + cat autorun-spdk.conf
00:01:18.272 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:18.272 SPDK_RUN_UBSAN=1
00:01:18.272 SPDK_TEST_FUZZER=1
00:01:18.272 SPDK_TEST_FUZZER_SHORT=1
00:01:18.272 SPDK_TEST_SETUP=1
00:01:18.272 SPDK_TEST_NATIVE_DPDK=main
00:01:18.272 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:18.279 RUN_NIGHTLY=1
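The flat KEY=VALUE file printed above is what the Tests stage sources a few lines below (`set -ex`, `source .../autorun-spdk.conf`, then a `case` on SPDK_TEST_NVMF_NICS). A minimal sketch of that consumption pattern, assuming the same file layout; the mlx5 branch is hypothetical, since this run leaves SPDK_TEST_NVMF_NICS unset and DRIVERS empty:

```bash
#!/usr/bin/env bash
# Sketch of how the job consumes autorun-spdk.conf (mirrors the traced
# 'set -ex' block below; SPDK_TEST_NVMF_NICS is unset in this run).
set -ex
conf=/var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
[[ -f $conf ]] && source "$conf"   # sets SPDK_RUN_UBSAN, SPDK_TEST_FUZZER, ...
case $SPDK_TEST_NVMF_NICS in
    mlx5_core) DRIVERS=mlx5_ib ;;  # hypothetical mapping for NIC-enabled jobs
    *)         DRIVERS= ;;         # this job: no NICs requested
esac
[[ -n $DRIVERS ]] || exit 0        # nothing to modprobe, matches '+ exit 0'
```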
00:01:18.285 [Pipeline] readFile
00:01:18.310 [Pipeline] withEnv
00:01:18.313 [Pipeline] {
00:01:18.325 [Pipeline] sh
00:01:18.609 + set -ex
00:01:18.609 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:18.609 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:18.609 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:18.609 ++ SPDK_RUN_UBSAN=1
00:01:18.609 ++ SPDK_TEST_FUZZER=1
00:01:18.609 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:18.609 ++ SPDK_TEST_SETUP=1
00:01:18.609 ++ SPDK_TEST_NATIVE_DPDK=main
00:01:18.609 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:18.609 ++ RUN_NIGHTLY=1
00:01:18.609 + case $SPDK_TEST_NVMF_NICS in
00:01:18.609 + DRIVERS=
00:01:18.609 + [[ -n '' ]]
00:01:18.609 + exit 0
00:01:18.619 [Pipeline] }
00:01:18.634 [Pipeline] // withEnv
00:01:18.639 [Pipeline] }
00:01:18.654 [Pipeline] // stage
00:01:18.664 [Pipeline] catchError
00:01:18.666 [Pipeline] {
00:01:18.680 [Pipeline] timeout
00:01:18.680 Timeout set to expire in 30 min
00:01:18.682 [Pipeline] {
00:01:18.697 [Pipeline] stage
00:01:18.700 [Pipeline] { (Tests)
00:01:18.715 [Pipeline] sh
00:01:18.999 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:18.999 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:18.999 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:18.999 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:18.999 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:18.999 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:18.999 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:18.999 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:18.999 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:18.999 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:18.999 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:18.999 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:18.999 + source /etc/os-release
00:01:18.999 ++ NAME='Fedora Linux'
00:01:18.999 ++ VERSION='39 (Cloud Edition)'
00:01:18.999 ++ ID=fedora
00:01:18.999 ++ VERSION_ID=39
00:01:18.999 ++ VERSION_CODENAME=
00:01:18.999 ++ PLATFORM_ID=platform:f39
00:01:18.999 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:01:18.999 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:18.999 ++ LOGO=fedora-logo-icon
00:01:18.999 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:01:18.999 ++ HOME_URL=https://fedoraproject.org/
00:01:18.999 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:01:18.999 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:18.999 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:18.999 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:18.999 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:01:18.999 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:18.999 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:01:18.999 ++ SUPPORT_END=2024-11-12
00:01:18.999 ++ VARIANT='Cloud Edition'
00:01:18.999 ++ VARIANT_ID=cloud
00:01:18.999 + uname -a
00:01:18.999 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:01:18.999 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:22.286 Hugepages
00:01:22.286 node hugesize free / total
00:01:22.286 node0 1048576kB 0 / 0
00:01:22.286 node0 2048kB 0 / 0
00:01:22.286 node1 1048576kB 0 / 0
00:01:22.286 node1 2048kB 0 / 0
00:01:22.286
00:01:22.286 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:22.286 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:22.286 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:22.286 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:22.286 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:22.286 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:22.286 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:22.286 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:22.286 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:22.286 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:22.286 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:22.286 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:22.286 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:22.286 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:22.286 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:22.286 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:22.286 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:22.286 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:22.286 + rm -f /tmp/spdk-ld-path
00:01:22.286 + source autorun-spdk.conf
00:01:22.286 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:22.286 ++ SPDK_RUN_UBSAN=1
00:01:22.286 ++ SPDK_TEST_FUZZER=1
00:01:22.286 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:22.286 ++ SPDK_TEST_SETUP=1
00:01:22.286 ++ SPDK_TEST_NATIVE_DPDK=main
00:01:22.286 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:22.286 ++ RUN_NIGHTLY=1
00:01:22.286 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:22.286 + [[ -n '' ]]
00:01:22.286 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:22.286 + for M in /var/spdk/build-*-manifest.txt
00:01:22.286 + [[ -f /var/spdk/build-kernel-manifest.txt ]]
00:01:22.286 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:22.286 + for M in /var/spdk/build-*-manifest.txt
00:01:22.286 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:22.286 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:22.286 + for M in /var/spdk/build-*-manifest.txt
00:01:22.286 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:22.286 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:22.286 ++ uname
00:01:22.286 + [[ Linux == \L\i\n\u\x ]]
00:01:22.286 + sudo dmesg -T
00:01:22.286 + sudo dmesg --clear
00:01:22.286 + dmesg_pid=1565532
00:01:22.286 + [[ Fedora Linux == FreeBSD ]]
00:01:22.286 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:22.286 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:22.286 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:22.286 + [[ -x /usr/src/fio-static/fio ]]
00:01:22.286 + export FIO_BIN=/usr/src/fio-static/fio
00:01:22.286 + FIO_BIN=/usr/src/fio-static/fio
00:01:22.286 + sudo dmesg -Tw
00:01:22.286 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:22.286 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:22.286 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:22.286 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:22.286 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:22.286 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:22.286 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:22.286 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:22.286 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
15:38:30 -- common/autotest_common.sh@1692 -- $ [[ n == y ]]
15:38:30 -- spdk/autorun.sh@20 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
15:38:30 -- short-fuzz-phy-autotest/autorun-spdk.conf@1 -- $ SPDK_RUN_FUNCTIONAL_TEST=1
15:38:30 -- short-fuzz-phy-autotest/autorun-spdk.conf@2 -- $ SPDK_RUN_UBSAN=1
15:38:30 -- short-fuzz-phy-autotest/autorun-spdk.conf@3 -- $ SPDK_TEST_FUZZER=1
15:38:30 -- short-fuzz-phy-autotest/autorun-spdk.conf@4 -- $ SPDK_TEST_FUZZER_SHORT=1
15:38:30 -- short-fuzz-phy-autotest/autorun-spdk.conf@5 -- $ SPDK_TEST_SETUP=1
15:38:30 -- short-fuzz-phy-autotest/autorun-spdk.conf@6 -- $ SPDK_TEST_NATIVE_DPDK=main
15:38:30 -- short-fuzz-phy-autotest/autorun-spdk.conf@7 -- $ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
15:38:30 -- short-fuzz-phy-autotest/autorun-spdk.conf@8 -- $ RUN_NIGHTLY=1
15:38:30 -- spdk/autorun.sh@22 -- $ trap 'timing_finish || exit 1' EXIT
15:38:30 -- spdk/autorun.sh@25 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autobuild.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
15:38:30 -- common/autotest_common.sh@1692 -- $ [[ n == y ]]
15:38:30 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
15:38:30 -- scripts/common.sh@15 -- $ shopt -s extglob
15:38:30 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]]
15:38:30 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
15:38:30 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
15:38:30 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
15:38:30 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
15:38:30 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
15:38:30 -- paths/export.sh@5 -- $ export PATH
15:38:30 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
15:38:30 -- common/autobuild_common.sh@492 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
15:38:30 -- common/autobuild_common.sh@493 -- $ date +%s
15:38:30 -- common/autobuild_common.sh@493 -- $ mktemp -dt spdk_1732977510.XXXXXX
15:38:30 -- common/autobuild_common.sh@493 -- $ SPDK_WORKSPACE=/tmp/spdk_1732977510.PduW7R
15:38:30 -- common/autobuild_common.sh@495 -- $ [[ -n '' ]]
15:38:30 -- common/autobuild_common.sh@499 -- $ '[' -n main ']'
15:38:30 -- common/autobuild_common.sh@500 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
15:38:30 -- common/autobuild_common.sh@500 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
15:38:30 -- common/autobuild_common.sh@506 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
15:38:30 -- common/autobuild_common.sh@508 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
15:38:30 -- common/autobuild_common.sh@509 -- $ get_config_params
15:38:30 -- common/autotest_common.sh@409 -- $ xtrace_disable
15:38:30 -- common/autotest_common.sh@10 -- $ set +x
15:38:30 -- common/autobuild_common.sh@509 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
15:38:30 -- common/autobuild_common.sh@511 -- $ start_monitor_resources
15:38:30 -- pm/common@17 -- $ local monitor
15:38:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
15:38:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
15:38:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
15:38:30 -- pm/common@21 -- $ date +%s
15:38:30 -- pm/common@21 -- $ date +%s
15:38:30 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
15:38:30 -- pm/common@25 -- $ sleep 1
15:38:30 -- pm/common@21 -- $ date +%s
15:38:30 -- pm/common@21 -- $ date +%s
15:38:30 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732977510
15:38:30 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732977510
15:38:30 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732977510
15:38:30 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732977510
00:01:22.545 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732977510_collect-vmstat.pm.log
00:01:22.545 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732977510_collect-cpu-load.pm.log
00:01:22.545 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732977510_collect-cpu-temp.pm.log
00:01:22.545 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732977510_collect-bmc-pm.bmc.pm.log
15:38:31 -- common/autobuild_common.sh@512 -- $ trap stop_monitor_resources EXIT
15:38:31 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
15:38:31 -- spdk/autobuild.sh@12 -- $ umask 022
15:38:31 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
15:38:31 -- spdk/autobuild.sh@16 -- $ date -u
00:01:23.482 Sat Nov 30 02:38:31 PM UTC 2024
15:38:31 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:23.482 v25.01-pre-276-g35cd3e84d
15:38:31 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
15:38:31 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
15:38:31 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
15:38:31 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']'
15:38:31 -- common/autotest_common.sh@1111 -- $ xtrace_disable
15:38:31 -- common/autotest_common.sh@10 -- $ set +x
00:01:23.482 ************************************
00:01:23.482 START TEST ubsan
00:01:23.482 ************************************
15:38:31 ubsan -- common/autotest_common.sh@1129 -- $ echo 'using ubsan'
00:01:23.482 using ubsan
00:01:23.482
00:01:23.482 real 0m0.000s
00:01:23.482 user 0m0.000s
00:01:23.482 sys 0m0.000s
15:38:31 ubsan -- common/autotest_common.sh@1130 -- $ xtrace_disable
15:38:31 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:01:23.482 ************************************
00:01:23.482 END TEST ubsan
00:01:23.482 ************************************
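The START/END banners and real/user/sys timings above come from SPDK's run_test helper, which wraps each test command. A minimal re-creation of that wrapper pattern (hypothetical; the real helper in autotest_common.sh also manages xtrace state and records timings for the summary):

```bash
# Sketch of a run_test-style wrapper, matching the banner/timing output
# seen in the log above.
run_test_sketch() {
    local name=$1; shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"                      # run the test command; timing goes to stderr
    local rc=$?
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
    return $rc
}
run_test_sketch ubsan echo 'using ubsan'
```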
15:38:31 -- spdk/autobuild.sh@27 -- $ '[' -n main ']'
15:38:31 -- spdk/autobuild.sh@28 -- $ build_native_dpdk
15:38:31 -- common/autobuild_common.sh@449 -- $ run_test build_native_dpdk _build_native_dpdk
15:38:31 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']'
15:38:31 -- common/autotest_common.sh@1111 -- $ xtrace_disable
15:38:31 -- common/autotest_common.sh@10 -- $ set +x
00:01:23.740 ************************************
00:01:23.740 START TEST build_native_dpdk
00:01:23.740 ************************************
15:38:31 build_native_dpdk -- common/autotest_common.sh@1129 -- $ _build_native_dpdk
15:38:31 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir
15:38:31 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir
15:38:31 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version
15:38:31 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler
15:38:31 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods
15:38:31 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk
15:38:31 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc
15:38:31 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc
15:38:31 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc
15:38:31 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]]
15:38:31 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]]
15:38:31 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion
15:38:31 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13
15:38:31 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13
15:38:31 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
15:38:31 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
15:38:31 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
15:38:31 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]]
15:38:31 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
15:38:31 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5
00:01:23.741 8750576fb2 doc: reword some sample app guides
00:01:23.741 c0f5a9dd74 doc: fix grammar and phrasing in multi-process app guide
00:01:23.741 b456bf5006 usertools/devbind: fix NUMA node display
00:01:23.741 828fe9de4c usertools/devbind: restore active marker
00:01:23.741 497cf54829 dts: remove nested html directory for API doc
15:38:31 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon'
15:38:31 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags=
15:38:31 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=24.11.0-rc4
15:38:31 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]]
15:38:31 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]]
15:38:31 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror'
15:38:31 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]]
15:38:31 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]]
15:38:31 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow'
15:38:31 build_native_dpdk -- common/autobuild_common.sh@102 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base" "power/acpi" "power/amd_pstate" "power/cppc" "power/intel_pstate" "power/intel_uncore" "power/kvm_vm")
15:38:31 build_native_dpdk -- common/autobuild_common.sh@103 -- $ local mlx5_libs_added=n
15:38:31 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]]
15:38:31 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]]
15:38:31 build_native_dpdk -- common/autobuild_common.sh@146 -- $ [[ 0 -eq 1 ]]
15:38:31 build_native_dpdk -- common/autobuild_common.sh@174 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
15:38:31 build_native_dpdk -- common/autobuild_common.sh@175 -- $ uname -s
15:38:31 build_native_dpdk -- common/autobuild_common.sh@175 -- $ '[' Linux = Linux ']'
15:38:31 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 24.11.0-rc4 21.11.0
15:38:31 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 24.11.0-rc4 '<' 21.11.0
15:38:31 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
15:38:31 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
15:38:31 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
15:38:31 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
15:38:31 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
15:38:31 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
15:38:31 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<'
15:38:31 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=4
15:38:31 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
15:38:31 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
15:38:31 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
15:38:31 build_native_dpdk -- scripts/common.sh@345 -- $ : 1
15:38:31 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
15:38:31 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
15:38:31 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 24
15:38:31 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24
15:38:31 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]]
15:38:31 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24
15:38:31 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=24
15:38:31 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21
15:38:31 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21
15:38:31 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]]
15:38:31 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21
15:38:31 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21
15:38:31 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
15:38:31 build_native_dpdk -- scripts/common.sh@367 -- $ return 1
15:38:31 build_native_dpdk -- common/autobuild_common.sh@180 -- $ patch -p1
patching file config/rte_config.h
Hunk #1 succeeded at 72 (offset 13 lines).
15:38:31 build_native_dpdk -- common/autobuild_common.sh@183 -- $ lt 24.11.0-rc4 24.07.0
15:38:31 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 24.11.0-rc4 '<' 24.07.0
15:38:31 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
15:38:31 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
15:38:31 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
15:38:31 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
15:38:31 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
15:38:31 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
15:38:31 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<'
15:38:31 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=4
15:38:31 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
15:38:31 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
15:38:31 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
15:38:31 build_native_dpdk -- scripts/common.sh@345 -- $ : 1
15:38:31 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
15:38:31 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
15:38:31 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 24
15:38:31 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24
15:38:31 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]]
15:38:31 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24
15:38:31 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=24
15:38:31 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24
15:38:31 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24
15:38:31 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]]
15:38:31 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24
15:38:31 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24
15:38:31 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
15:38:31 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] ))
15:38:31 build_native_dpdk -- scripts/common.sh@364 -- $ (( v++ ))
15:38:31 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
15:38:31 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 11
15:38:31 build_native_dpdk -- scripts/common.sh@353 -- $ local d=11
15:38:31 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 11 =~ ^[0-9]+$ ]]
15:38:31 build_native_dpdk -- scripts/common.sh@355 -- $ echo 11
15:38:31 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=11
15:38:31 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 07
15:38:31 build_native_dpdk -- scripts/common.sh@353 -- $ local d=07
15:38:31 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 07 =~ ^[0-9]+$ ]]
15:38:31 build_native_dpdk -- scripts/common.sh@355 -- $ echo 7
15:38:31 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=7
15:38:31 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
15:38:31 build_native_dpdk -- scripts/common.sh@367 -- $ return 1
15:38:31 build_native_dpdk -- common/autobuild_common.sh@186 -- $ ge 24.11.0-rc4 24.07.0
15:38:31 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 24.11.0-rc4 '>=' 24.07.0
15:38:31 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
15:38:31 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
15:38:31 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
15:38:31 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
15:38:31 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
15:38:31 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
15:38:31 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>='
15:38:31 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=4
15:38:31 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
15:38:31 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
15:38:31 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
15:38:31 build_native_dpdk -- scripts/common.sh@348 -- $ : 1
15:38:31 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
15:38:31 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
15:38:31 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 24
15:38:31 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24
15:38:31 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]]
15:38:31 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24
15:38:31 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=24
15:38:31 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24
15:38:31 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24
15:38:31 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]]
15:38:31 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24
15:38:31 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24
15:38:31 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
15:38:31 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] ))
15:38:31 build_native_dpdk -- scripts/common.sh@364 -- $ (( v++ ))
15:38:31 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
15:38:31 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 11
15:38:31 build_native_dpdk -- scripts/common.sh@353 -- $ local d=11
15:38:31 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 11 =~ ^[0-9]+$ ]]
15:38:31 build_native_dpdk -- scripts/common.sh@355 -- $ echo 11
15:38:31 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=11
15:38:31 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 07
15:38:31 build_native_dpdk -- scripts/common.sh@353 -- $ local d=07
15:38:31 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 07 =~ ^[0-9]+$ ]]
15:38:31 build_native_dpdk -- scripts/common.sh@355 -- $ echo 7
15:38:31 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=7
15:38:31 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
15:38:31 build_native_dpdk -- scripts/common.sh@367 -- $ return 0
15:38:31 build_native_dpdk -- common/autobuild_common.sh@187 -- $ patch -p1
patching file drivers/bus/pci/linux/pci_uio.c
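Most of the trace above is scripts/common.sh's cmp_versions helper: it splits each version on '.', '-' and ':' (IFS=.-:) and compares field by field, so 24.11.0-rc4 sorts below nothing in 21.11.0 at the first field (24 > 21, hence `lt ... 21.11.0` returns 1 and the rte_config.h patch applies) and above 24.07.0 at the second field (11 > 7, hence `ge ... 24.07.0` returns 0 and the pci_uio patch applies). A compressed sketch of that comparison; non-numeric fields like "rc4" are simplified to 0 here, which the real decimal() helper treats more carefully:

```bash
#!/usr/bin/env bash
# Field-wise dotted-version compare, as traced above from scripts/common.sh.
# Returns 0 when $1 >= $2, e.g. version_ge 24.11.0-rc4 24.07.0 -> 0.
version_ge() {
    local -a ver1 ver2
    IFS='.-:' read -ra ver1 <<< "$1"
    IFS='.-:' read -ra ver2 <<< "$2"
    local v len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
    for (( v = 0; v < len; v++ )); do
        local a=${ver1[v]:-0} b=${ver2[v]:-0}
        [[ $a =~ ^[0-9]+$ ]] || a=0        # simplification: "rc4" -> 0
        [[ $b =~ ^[0-9]+$ ]] || b=0
        (( 10#$a > 10#$b )) && return 0    # 10# forces base 10 (handles "07")
        (( 10#$a < 10#$b )) && return 1
    done
    return 0                               # all fields equal
}
version_ge 24.11.0-rc4 24.07.0 && echo "new enough"   # 11 > 7 -> true
```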
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:29.010 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:01:29.010 Build type: native build 00:01:29.010 Project name: DPDK 00:01:29.010 Project version: 24.11.0-rc4 00:01:29.010 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:01:29.010 C linker for the host machine: gcc ld.bfd 2.40-14 00:01:29.010 Host machine cpu family: x86_64 00:01:29.010 Host machine cpu: x86_64 00:01:29.010 Message: ## Building in Developer Mode ## 00:01:29.010 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:29.010 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:01:29.010 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:01:29.010 Program python3 (elftools) found: YES (/usr/bin/python3) modules: elftools 00:01:29.010 Program cat found: YES (/usr/bin/cat) 00:01:29.010 config/meson.build:122: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 00:01:29.010 Compiler for C supports arguments -march=native: YES 00:01:29.010 Checking for size of "void *" : 8 00:01:29.010 Checking for size of "void *" : 8 (cached) 00:01:29.010 Compiler for C supports link arguments -Wl,--undefined-version: YES 00:01:29.010 Library m found: YES 00:01:29.010 Library numa found: YES 00:01:29.010 Has header "numaif.h" : YES 00:01:29.010 Library fdt found: NO 00:01:29.010 Library execinfo found: NO 00:01:29.010 Has header "execinfo.h" : YES 00:01:29.010 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:29.010 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:29.010 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:29.010 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:29.010 Run-time dependency openssl found: YES 3.1.1 00:01:29.010 Run-time dependency libpcap found: YES 1.10.4 00:01:29.010 Has header "pcap.h" with dependency libpcap: YES 00:01:29.010 Compiler for C supports arguments -Wcast-qual: YES 00:01:29.010 Compiler for C supports arguments -Wdeprecated: YES 00:01:29.010 Compiler for C supports arguments -Wformat: YES 00:01:29.010 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:29.010 Compiler for C supports arguments -Wformat-security: NO 00:01:29.010 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:29.010 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:29.010 Compiler for C supports arguments -Wnested-externs: YES 00:01:29.010 Compiler for C supports arguments -Wold-style-definition: YES 00:01:29.010 Compiler for C supports arguments -Wpointer-arith: YES 00:01:29.010 Compiler for C supports arguments -Wsign-compare: YES 00:01:29.010 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:29.010 Compiler for C supports arguments -Wundef: YES 00:01:29.010 Compiler for C supports arguments -Wwrite-strings: YES 00:01:29.010 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:29.010 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:29.011 Program objdump found: YES (/usr/bin/objdump) 00:01:29.011 Compiler for C supports arguments -mavx512f -mavx512vl -mavx512dq -mavx512bw: YES 00:01:29.011 Checking if "AVX512 checking" compiles: YES 00:01:29.011 Fetching value of define "__AVX512F__" : 1 00:01:29.011 Fetching value of define "__AVX512BW__" : 1 00:01:29.011 
Fetching value of define "__AVX512DQ__" : 1 00:01:29.011 Fetching value of define "__AVX512VL__" : 1 00:01:29.011 Fetching value of define "__SSE4_2__" : 1 00:01:29.011 Fetching value of define "__AES__" : 1 00:01:29.011 Fetching value of define "__AVX__" : 1 00:01:29.011 Fetching value of define "__AVX2__" : 1 00:01:29.011 Fetching value of define "__AVX512BW__" : 1 00:01:29.011 Fetching value of define "__AVX512CD__" : 1 00:01:29.011 Fetching value of define "__AVX512DQ__" : 1 00:01:29.011 Fetching value of define "__AVX512F__" : 1 00:01:29.011 Fetching value of define "__AVX512VL__" : 1 00:01:29.011 Fetching value of define "__PCLMUL__" : 1 00:01:29.011 Fetching value of define "__RDRND__" : 1 00:01:29.011 Fetching value of define "__RDSEED__" : 1 00:01:29.011 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:29.011 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:29.011 Message: lib/log: Defining dependency "log" 00:01:29.011 Message: lib/kvargs: Defining dependency "kvargs" 00:01:29.011 Message: lib/argparse: Defining dependency "argparse" 00:01:29.011 Message: lib/telemetry: Defining dependency "telemetry" 00:01:29.011 Checking for function "pthread_attr_setaffinity_np" : YES 00:01:29.011 Checking for function "getentropy" : NO 00:01:29.011 Message: lib/eal: Defining dependency "eal" 00:01:29.011 Message: lib/ptr_compress: Defining dependency "ptr_compress" 00:01:29.011 Message: lib/ring: Defining dependency "ring" 00:01:29.011 Message: lib/rcu: Defining dependency "rcu" 00:01:29.011 Message: lib/mempool: Defining dependency "mempool" 00:01:29.011 Message: lib/mbuf: Defining dependency "mbuf" 00:01:29.011 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:29.011 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:29.011 Compiler for C supports arguments -mpclmul: YES 00:01:29.011 Compiler for C supports arguments -maes: YES 00:01:29.011 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:29.011 Message: lib/net: Defining dependency "net" 00:01:29.011 Message: lib/meter: Defining dependency "meter" 00:01:29.011 Message: lib/ethdev: Defining dependency "ethdev" 00:01:29.011 Message: lib/pci: Defining dependency "pci" 00:01:29.011 Message: lib/cmdline: Defining dependency "cmdline" 00:01:29.011 Message: lib/metrics: Defining dependency "metrics" 00:01:29.011 Message: lib/hash: Defining dependency "hash" 00:01:29.011 Message: lib/timer: Defining dependency "timer" 00:01:29.011 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:29.011 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:29.011 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:29.011 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:29.011 Message: lib/acl: Defining dependency "acl" 00:01:29.011 Message: lib/bbdev: Defining dependency "bbdev" 00:01:29.011 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:29.011 Run-time dependency libelf found: YES 0.191 00:01:29.011 Message: lib/bpf: Defining dependency "bpf" 00:01:29.011 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:29.011 Message: lib/compressdev: Defining dependency "compressdev" 00:01:29.011 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:29.011 Message: lib/distributor: Defining dependency "distributor" 00:01:29.011 Message: lib/dmadev: Defining dependency "dmadev" 00:01:29.011 Message: lib/efd: Defining dependency "efd" 00:01:29.011 Message: lib/eventdev: Defining dependency "eventdev" 00:01:29.011 Message: lib/dispatcher: Defining 
dependency "dispatcher" 00:01:29.011 Message: lib/gpudev: Defining dependency "gpudev" 00:01:29.011 Message: lib/gro: Defining dependency "gro" 00:01:29.011 Message: lib/gso: Defining dependency "gso" 00:01:29.011 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:29.011 Message: lib/jobstats: Defining dependency "jobstats" 00:01:29.011 Message: lib/latencystats: Defining dependency "latencystats" 00:01:29.011 Message: lib/lpm: Defining dependency "lpm" 00:01:29.011 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:29.011 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:29.011 Fetching value of define "__AVX512IFMA__" : (undefined) 00:01:29.011 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:29.011 Message: lib/member: Defining dependency "member" 00:01:29.011 Message: lib/pcapng: Defining dependency "pcapng" 00:01:29.011 Message: lib/power: Defining dependency "power" 00:01:29.011 Message: lib/rawdev: Defining dependency "rawdev" 00:01:29.011 Message: lib/regexdev: Defining dependency "regexdev" 00:01:29.011 Message: lib/mldev: Defining dependency "mldev" 00:01:29.011 Message: lib/rib: Defining dependency "rib" 00:01:29.011 Message: lib/reorder: Defining dependency "reorder" 00:01:29.011 Message: lib/sched: Defining dependency "sched" 00:01:29.011 Message: lib/security: Defining dependency "security" 00:01:29.011 Message: lib/stack: Defining dependency "stack" 00:01:29.011 Has header "linux/userfaultfd.h" : YES 00:01:29.011 Has header "linux/vduse.h" : YES 00:01:29.011 Message: lib/vhost: Defining dependency "vhost" 00:01:29.011 Message: lib/ipsec: Defining dependency "ipsec" 00:01:29.011 Message: lib/pdcp: Defining dependency "pdcp" 00:01:29.011 Message: lib/fib: Defining dependency "fib" 00:01:29.011 Message: lib/port: Defining dependency "port" 00:01:29.011 Message: lib/pdump: Defining dependency "pdump" 00:01:29.011 Message: lib/table: Defining dependency "table" 00:01:29.011 Message: lib/pipeline: Defining dependency "pipeline" 00:01:29.011 Message: lib/graph: Defining dependency "graph" 00:01:29.011 Message: lib/node: Defining dependency "node" 00:01:29.011 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:29.011 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:29.011 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:29.011 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:29.011 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:29.011 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:29.011 Compiler for C supports arguments -Wno-unused-value: YES 00:01:29.011 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:29.011 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:29.011 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:29.011 Compiler for C supports arguments -march=skylake-avx512: YES 00:01:29.270 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:01:29.270 Message: drivers/power/acpi: Defining dependency "power_acpi" 00:01:29.270 Message: drivers/power/amd_pstate: Defining dependency "power_amd_pstate" 00:01:29.270 Message: drivers/power/cppc: Defining dependency "power_cppc" 00:01:29.270 Message: drivers/power/intel_pstate: Defining dependency "power_intel_pstate" 00:01:29.270 Message: drivers/power/intel_uncore: Defining dependency "power_intel_uncore" 00:01:29.270 Message: drivers/power/kvm_vm: Defining dependency "power_kvm_vm" 00:01:29.270 
Has header "sys/epoll.h" : YES 00:01:29.270 Program doxygen found: YES (/usr/local/bin/doxygen) 00:01:29.270 Configuring doxy-api-html.conf using configuration 00:01:29.270 Configuring doxy-api-man.conf using configuration 00:01:29.270 Program mandb found: YES (/usr/bin/mandb) 00:01:29.270 Program sphinx-build found: NO 00:01:29.270 Program sphinx-build found: NO 00:01:29.270 Configuring rte_build_config.h using configuration 00:01:29.270 Message: 00:01:29.270 ================= 00:01:29.270 Applications Enabled 00:01:29.270 ================= 00:01:29.270 00:01:29.270 apps: 00:01:29.270 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, 00:01:29.270 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline, 00:01:29.270 test-pmd, test-regex, test-sad, test-security-perf, 00:01:29.270 00:01:29.270 Message: 00:01:29.270 ================= 00:01:29.270 Libraries Enabled 00:01:29.270 ================= 00:01:29.270 00:01:29.270 libs: 00:01:29.270 log, kvargs, argparse, telemetry, eal, ptr_compress, ring, rcu, 00:01:29.270 mempool, mbuf, net, meter, ethdev, pci, cmdline, metrics, 00:01:29.270 hash, timer, acl, bbdev, bitratestats, bpf, cfgfile, compressdev, 00:01:29.270 cryptodev, distributor, dmadev, efd, eventdev, dispatcher, gpudev, gro, 00:01:29.270 gso, ip_frag, jobstats, latencystats, lpm, member, pcapng, power, 00:01:29.270 rawdev, regexdev, mldev, rib, reorder, sched, security, stack, 00:01:29.270 vhost, ipsec, pdcp, fib, port, pdump, table, pipeline, 00:01:29.270 graph, node, 00:01:29.270 00:01:29.270 Message: 00:01:29.270 =============== 00:01:29.270 Drivers Enabled 00:01:29.270 =============== 00:01:29.270 00:01:29.270 common: 00:01:29.270 00:01:29.270 bus: 00:01:29.270 pci, vdev, 00:01:29.270 mempool: 00:01:29.270 ring, 00:01:29.270 dma: 00:01:29.270 00:01:29.270 net: 00:01:29.270 i40e, 00:01:29.270 raw: 00:01:29.270 00:01:29.270 crypto: 00:01:29.270 00:01:29.270 compress: 00:01:29.270 00:01:29.270 regex: 00:01:29.270 00:01:29.270 ml: 00:01:29.270 00:01:29.270 vdpa: 00:01:29.270 00:01:29.270 event: 00:01:29.270 00:01:29.270 baseband: 00:01:29.270 00:01:29.270 gpu: 00:01:29.270 00:01:29.270 power: 00:01:29.270 acpi, amd_pstate, cppc, intel_pstate, intel_uncore, kvm_vm, 00:01:29.270 00:01:29.270 Message: 00:01:29.270 ================= 00:01:29.270 Content Skipped 00:01:29.270 ================= 00:01:29.270 00:01:29.270 apps: 00:01:29.270 00:01:29.270 libs: 00:01:29.270 00:01:29.270 drivers: 00:01:29.270 common/cpt: not in enabled drivers build config 00:01:29.270 common/dpaax: not in enabled drivers build config 00:01:29.270 common/iavf: not in enabled drivers build config 00:01:29.270 common/idpf: not in enabled drivers build config 00:01:29.270 common/ionic: not in enabled drivers build config 00:01:29.270 common/mvep: not in enabled drivers build config 00:01:29.270 common/octeontx: not in enabled drivers build config 00:01:29.270 bus/auxiliary: not in enabled drivers build config 00:01:29.270 bus/cdx: not in enabled drivers build config 00:01:29.270 bus/dpaa: not in enabled drivers build config 00:01:29.270 bus/fslmc: not in enabled drivers build config 00:01:29.270 bus/ifpga: not in enabled drivers build config 00:01:29.270 bus/platform: not in enabled drivers build config 00:01:29.270 bus/uacce: not in enabled drivers build config 00:01:29.270 bus/vmbus: not in enabled drivers build config 00:01:29.270 common/cnxk: not in enabled drivers build config 00:01:29.270 common/mlx5: not in 
enabled drivers build config 00:01:29.270 common/nfp: not in enabled drivers build config 00:01:29.270 common/nitrox: not in enabled drivers build config 00:01:29.270 common/qat: not in enabled drivers build config 00:01:29.270 common/sfc_efx: not in enabled drivers build config 00:01:29.270 mempool/bucket: not in enabled drivers build config 00:01:29.270 mempool/cnxk: not in enabled drivers build config 00:01:29.270 mempool/dpaa: not in enabled drivers build config 00:01:29.270 mempool/dpaa2: not in enabled drivers build config 00:01:29.270 mempool/octeontx: not in enabled drivers build config 00:01:29.270 mempool/stack: not in enabled drivers build config 00:01:29.270 dma/cnxk: not in enabled drivers build config 00:01:29.270 dma/dpaa: not in enabled drivers build config 00:01:29.270 dma/dpaa2: not in enabled drivers build config 00:01:29.270 dma/hisilicon: not in enabled drivers build config 00:01:29.270 dma/idxd: not in enabled drivers build config 00:01:29.270 dma/ioat: not in enabled drivers build config 00:01:29.270 dma/odm: not in enabled drivers build config 00:01:29.270 dma/skeleton: not in enabled drivers build config 00:01:29.270 net/af_packet: not in enabled drivers build config 00:01:29.270 net/af_xdp: not in enabled drivers build config 00:01:29.270 net/ark: not in enabled drivers build config 00:01:29.270 net/atlantic: not in enabled drivers build config 00:01:29.270 net/avp: not in enabled drivers build config 00:01:29.270 net/axgbe: not in enabled drivers build config 00:01:29.270 net/bnx2x: not in enabled drivers build config 00:01:29.270 net/bnxt: not in enabled drivers build config 00:01:29.270 net/bonding: not in enabled drivers build config 00:01:29.270 net/cnxk: not in enabled drivers build config 00:01:29.270 net/cpfl: not in enabled drivers build config 00:01:29.270 net/cxgbe: not in enabled drivers build config 00:01:29.270 net/dpaa: not in enabled drivers build config 00:01:29.270 net/dpaa2: not in enabled drivers build config 00:01:29.271 net/e1000: not in enabled drivers build config 00:01:29.271 net/ena: not in enabled drivers build config 00:01:29.271 net/enetc: not in enabled drivers build config 00:01:29.271 net/enetfec: not in enabled drivers build config 00:01:29.271 net/enic: not in enabled drivers build config 00:01:29.271 net/failsafe: not in enabled drivers build config 00:01:29.271 net/fm10k: not in enabled drivers build config 00:01:29.271 net/gve: not in enabled drivers build config 00:01:29.271 net/hinic: not in enabled drivers build config 00:01:29.271 net/hns3: not in enabled drivers build config 00:01:29.271 net/iavf: not in enabled drivers build config 00:01:29.271 net/ice: not in enabled drivers build config 00:01:29.271 net/idpf: not in enabled drivers build config 00:01:29.271 net/igc: not in enabled drivers build config 00:01:29.271 net/ionic: not in enabled drivers build config 00:01:29.271 net/ipn3ke: not in enabled drivers build config 00:01:29.271 net/ixgbe: not in enabled drivers build config 00:01:29.271 net/mana: not in enabled drivers build config 00:01:29.271 net/memif: not in enabled drivers build config 00:01:29.271 net/mlx4: not in enabled drivers build config 00:01:29.271 net/mlx5: not in enabled drivers build config 00:01:29.271 net/mvneta: not in enabled drivers build config 00:01:29.271 net/mvpp2: not in enabled drivers build config 00:01:29.271 net/netvsc: not in enabled drivers build config 00:01:29.271 net/nfb: not in enabled drivers build config 00:01:29.271 net/nfp: not in enabled drivers build config 00:01:29.271 
net/ngbe: not in enabled drivers build config 00:01:29.271 net/ntnic: not in enabled drivers build config 00:01:29.271 net/null: not in enabled drivers build config 00:01:29.271 net/octeontx: not in enabled drivers build config 00:01:29.271 net/octeon_ep: not in enabled drivers build config 00:01:29.271 net/pcap: not in enabled drivers build config 00:01:29.271 net/pfe: not in enabled drivers build config 00:01:29.271 net/qede: not in enabled drivers build config 00:01:29.271 net/r8169: not in enabled drivers build config 00:01:29.271 net/ring: not in enabled drivers build config 00:01:29.271 net/sfc: not in enabled drivers build config 00:01:29.271 net/softnic: not in enabled drivers build config 00:01:29.271 net/tap: not in enabled drivers build config 00:01:29.271 net/thunderx: not in enabled drivers build config 00:01:29.271 net/txgbe: not in enabled drivers build config 00:01:29.271 net/vdev_netvsc: not in enabled drivers build config 00:01:29.271 net/vhost: not in enabled drivers build config 00:01:29.271 net/virtio: not in enabled drivers build config 00:01:29.271 net/vmxnet3: not in enabled drivers build config 00:01:29.271 net/zxdh: not in enabled drivers build config 00:01:29.271 raw/cnxk_bphy: not in enabled drivers build config 00:01:29.271 raw/cnxk_gpio: not in enabled drivers build config 00:01:29.271 raw/cnxk_rvu_lf: not in enabled drivers build config 00:01:29.271 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:29.271 raw/gdtc: not in enabled drivers build config 00:01:29.271 raw/ifpga: not in enabled drivers build config 00:01:29.271 raw/ntb: not in enabled drivers build config 00:01:29.271 raw/skeleton: not in enabled drivers build config 00:01:29.271 crypto/armv8: not in enabled drivers build config 00:01:29.271 crypto/bcmfs: not in enabled drivers build config 00:01:29.271 crypto/caam_jr: not in enabled drivers build config 00:01:29.271 crypto/ccp: not in enabled drivers build config 00:01:29.271 crypto/cnxk: not in enabled drivers build config 00:01:29.271 crypto/dpaa_sec: not in enabled drivers build config 00:01:29.271 crypto/dpaa2_sec: not in enabled drivers build config 00:01:29.271 crypto/ionic: not in enabled drivers build config 00:01:29.271 crypto/ipsec_mb: not in enabled drivers build config 00:01:29.271 crypto/mlx5: not in enabled drivers build config 00:01:29.271 crypto/mvsam: not in enabled drivers build config 00:01:29.271 crypto/nitrox: not in enabled drivers build config 00:01:29.271 crypto/null: not in enabled drivers build config 00:01:29.271 crypto/octeontx: not in enabled drivers build config 00:01:29.271 crypto/openssl: not in enabled drivers build config 00:01:29.271 crypto/scheduler: not in enabled drivers build config 00:01:29.271 crypto/uadk: not in enabled drivers build config 00:01:29.271 crypto/virtio: not in enabled drivers build config 00:01:29.271 compress/isal: not in enabled drivers build config 00:01:29.271 compress/mlx5: not in enabled drivers build config 00:01:29.271 compress/nitrox: not in enabled drivers build config 00:01:29.271 compress/octeontx: not in enabled drivers build config 00:01:29.271 compress/uadk: not in enabled drivers build config 00:01:29.271 compress/zlib: not in enabled drivers build config 00:01:29.271 regex/mlx5: not in enabled drivers build config 00:01:29.271 regex/cn9k: not in enabled drivers build config 00:01:29.271 ml/cnxk: not in enabled drivers build config 00:01:29.271 vdpa/ifc: not in enabled drivers build config 00:01:29.271 vdpa/mlx5: not in enabled drivers build config 00:01:29.271 
vdpa/nfp: not in enabled drivers build config 00:01:29.271 vdpa/sfc: not in enabled drivers build config 00:01:29.271 event/cnxk: not in enabled drivers build config 00:01:29.271 event/dlb2: not in enabled drivers build config 00:01:29.271 event/dpaa: not in enabled drivers build config 00:01:29.271 event/dpaa2: not in enabled drivers build config 00:01:29.271 event/dsw: not in enabled drivers build config 00:01:29.271 event/opdl: not in enabled drivers build config 00:01:29.271 event/skeleton: not in enabled drivers build config 00:01:29.271 event/sw: not in enabled drivers build config 00:01:29.271 event/octeontx: not in enabled drivers build config 00:01:29.271 baseband/acc: not in enabled drivers build config 00:01:29.271 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:01:29.271 baseband/fpga_lte_fec: not in enabled drivers build config 00:01:29.271 baseband/la12xx: not in enabled drivers build config 00:01:29.271 baseband/null: not in enabled drivers build config 00:01:29.271 baseband/turbo_sw: not in enabled drivers build config 00:01:29.271 gpu/cuda: not in enabled drivers build config 00:01:29.271 power/amd_uncore: not in enabled drivers build config 00:01:29.271 00:01:29.271 00:01:29.271 Message: DPDK build config complete: 00:01:29.271 source path = "/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk" 00:01:29.271 build path = "/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp" 00:01:29.271 Build targets in project: 246 00:01:29.271 00:01:29.271 DPDK 24.11.0-rc4 00:01:29.271 00:01:29.271 User defined options 00:01:29.271 libdir : lib 00:01:29.271 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:29.271 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:01:29.271 c_link_args : 00:01:29.271 enable_docs : false 00:01:29.837 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm, 00:01:29.837 enable_kmods : false 00:01:29.837 machine : native 00:01:29.837 tests : false 00:01:29.837 00:01:29.837 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:29.837 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
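The configuration summarized above corresponds to an explicit `meson setup` invocation roughly like the one sketched below. This is a reconstruction from the "User defined options" block, not the literal command line the autobuild script executed (paths, c_args, and option values are taken from the log; the option spellings use standard meson `-D` syntax). Spelling out the `setup` subcommand also avoids the deprecation WARNING printed above:

    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
        --libdir=lib \
        -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Denable_docs=false \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm \
        -Denable_kmods=false \
        -Dmachine=native \
        -Dtests=false
    ninja -C build-tmp -j112

Narrowing the build with enable_drivers/enable_docs/tests, as done here, is what produces the long "Content Skipped" list above: only the bus, ring mempool, i40e net, and power drivers are compiled, and everything else is left out of the build graph.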
00:01:29.837 15:38:37 build_native_dpdk -- common/autobuild_common.sh@199 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112
00:01:30.103 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:01:30.103 [1/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:01:30.103 [2/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:01:30.103 [3/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:01:30.103 [4/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:01:30.103 [5/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:01:30.365 [6/766] Compiling C object lib/librte_log.a.p/log_log_syslog.c.o
00:01:30.365 [7/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:01:30.365 [8/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:01:30.365 [9/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:01:30.365 [10/766] Compiling C object lib/librte_log.a.p/log_log_color.c.o
00:01:30.365 [11/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:01:30.365 [12/766] Compiling C object lib/librte_log.a.p/log_log_timestamp.c.o
00:01:30.365 [13/766] Compiling C object lib/librte_log.a.p/log_log_journal.c.o
00:01:30.365 [14/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:01:30.365 [15/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:01:30.365 [16/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:01:30.365 [17/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:01:30.365 [18/766] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:01:30.365 [19/766] Linking static target lib/librte_kvargs.a
00:01:30.365 [20/766] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:01:30.365 [21/766] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:01:30.365 [22/766] Linking static target lib/librte_pci.a
00:01:30.365 [23/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:01:30.365 [24/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:01:30.365 [25/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:01:30.365 [26/766] Compiling C object lib/librte_log.a.p/log_log.c.o
00:01:30.365 [27/766] Linking static target lib/librte_log.a
00:01:30.633 [28/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:01:30.633 [29/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:01:30.633 [30/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:01:30.633 [31/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:01:30.633 [32/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:01:30.633 [33/766] Compiling C object lib/librte_argparse.a.p/argparse_rte_argparse.c.o
00:01:30.633 [34/766] Linking static target lib/librte_argparse.a
00:01:30.633 [35/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:01:30.894 [36/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_bitset.c.o
00:01:30.894 [37/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:01:30.894 [38/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:01:30.894 [39/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:01:30.894 [40/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:01:30.894 [41/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:01:30.894 [42/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:01:30.894 [43/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:01:30.894 [44/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:01:30.894 [45/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore_var.c.o
00:01:30.894 [46/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:01:30.894 [47/766] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:01:30.894 [48/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:01:30.894 [49/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_mmu.c.o
00:01:30.894 [50/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:01:30.894 [51/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:01:30.894 [52/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:01:30.894 [53/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:01:30.894 [54/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:01:30.894 [55/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:01:30.894 [56/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:01:30.894 [57/766] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:01:30.894 [58/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:01:30.894 [59/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:01:30.894 [60/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:01:30.894 [61/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:01:30.894 [62/766] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:01:30.894 [63/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:01:30.894 [64/766] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:01:30.894 [65/766] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:01:30.894 [66/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:01:30.894 [67/766] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:01:30.894 [68/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:01:30.894 [69/766] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:01:30.894 [70/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:01:30.894 [71/766] Linking static target lib/librte_meter.a
00:01:30.894 [72/766] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:01:30.894 [73/766] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:01:30.894 [74/766] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:01:30.894 [75/766] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:01:30.894 [76/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:01:30.894 [77/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:01:30.894 [78/766] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:01:30.894 [79/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:01:30.894 [80/766] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o
00:01:30.894 [81/766] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:01:30.894 [82/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:01:30.895 [83/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:01:30.895 [84/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:01:30.895 [85/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:01:30.895 [86/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:01:30.895 [87/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:01:30.895 [88/766] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:01:30.895 [89/766] Generating lib/argparse.sym_chk with a custom command (wrapped by meson to capture output)
00:01:30.895 [90/766] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o
00:01:30.895 [91/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:01:30.895 [92/766] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:01:30.895 [93/766] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:01:30.895 [94/766] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:01:30.895 [95/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:01:30.895 [96/766] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o
00:01:30.895 [97/766] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o
00:01:30.895 [98/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:01:30.895 [99/766] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:01:31.156 [100/766] Linking static target lib/net/libnet_crc_avx512_lib.a
00:01:31.156 [101/766] Linking static target lib/librte_cmdline.a
00:01:31.156 [102/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:01:31.156 [103/766] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:01:31.156 [104/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:01:31.156 [105/766] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:01:31.156 [106/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:01:31.156 [107/766] Linking static target lib/librte_ring.a
00:01:31.156 [108/766] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o
00:01:31.156 [109/766] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:01:31.156 [110/766] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:01:31.156 [111/766] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:01:31.156 [112/766] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:01:31.156 [113/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:01:31.156 [114/766] Linking static target lib/librte_metrics.a
00:01:31.156 [115/766] Linking static target lib/librte_net.a
00:01:31.156 [116/766] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:01:31.156 [117/766] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:01:31.156 [118/766] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:01:31.156 [119/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o
00:01:31.156 [120/766] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:01:31.156 [121/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:01:31.156 [122/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:01:31.156 [123/766] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o
00:01:31.156 [124/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:01:31.156 [125/766] Linking static target lib/librte_cfgfile.a
00:01:31.156 [126/766] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:01:31.156 [127/766] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:01:31.421 [128/766] Compiling C object lib/librte_power.a.p/power_rte_power_qos.c.o
00:01:31.421 [129/766] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:01:31.421 [130/766] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:01:31.421 [131/766] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output)
00:01:31.421 [132/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:01:31.421 [133/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:01:31.421 [134/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:01:31.421 [135/766] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gf2_poly_math.c.o
00:01:31.421 [136/766] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o
00:01:31.421 [137/766] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:01:31.421 [138/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o
00:01:31.421 [139/766] Linking target lib/librte_log.so.25.0
00:01:31.421 [140/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:01:31.421 [141/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o
00:01:31.421 [142/766] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o
00:01:31.421 [143/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o
00:01:31.421 [144/766] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o
00:01:31.421 [145/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:01:31.421 [146/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o
00:01:31.421 [147/766] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:01:31.421 [148/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:01:31.421 [149/766] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o
00:01:31.421 [150/766] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output)
00:01:31.421 [151/766] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o
00:01:31.421 [152/766] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:01:31.684 [153/766] Linking static target lib/librte_bitratestats.a
00:01:31.684 [154/766] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output)
00:01:31.684 [155/766] Linking static target lib/librte_mempool.a
00:01:31.684 [156/766] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
00:01:31.684 [157/766] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o
00:01:31.684 [158/766] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o
00:01:31.684 [159/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o
00:01:31.684 [160/766] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:01:31.684 [161/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:01:31.684 [162/766] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o
00:01:31.684 [163/766] Linking static target lib/librte_timer.a
00:01:31.684 [164/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o
00:01:31.684 [165/766] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o
00:01:31.684 [166/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o
00:01:31.684 [167/766] Linking static target lib/librte_jobstats.a
00:01:31.684 [168/766] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o
00:01:31.684 [169/766] Generating symbol file lib/librte_log.so.25.0.p/librte_log.so.25.0.symbols
00:01:31.684 [170/766] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o
00:01:31.684 [171/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o
00:01:31.684 [172/766] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o
00:01:31.684 [173/766] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o
00:01:31.684 [174/766] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o
00:01:31.684 [175/766] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o
00:01:31.684 [176/766] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output)
00:01:31.684 [177/766] Linking target lib/librte_kvargs.so.25.0
00:01:31.684 [178/766] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o
00:01:31.684 [179/766] Linking target lib/librte_argparse.so.25.0
00:01:31.684 [180/766] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o
00:01:31.684 [181/766] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o
00:01:31.684 [182/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o
00:01:31.684 [183/766] Compiling C object lib/librte_power.a.p/power_rte_power_cpufreq.c.o
00:01:31.684 [184/766] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o
00:01:31.684 [185/766] Compiling C object lib/librte_member.a.p/member_rte_member.c.o
00:01:31.684 [186/766] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o
00:01:31.684 [187/766] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o
00:01:31.684 [188/766] Linking static target lib/librte_compressdev.a
00:01:31.945 [189/766] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o
00:01:31.945 [190/766] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o
00:01:31.945 [191/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o
00:01:31.945 [192/766] Compiling C object lib/librte_port.a.p/port_port_log.c.o
00:01:31.945 [193/766] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output)
00:01:31.945 [194/766] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o
00:01:31.945 [195/766] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o
00:01:31.945 [196/766] Linking static target lib/member/libsketch_avx512_tmp.a
00:01:31.945 [197/766] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o
00:01:31.945 [198/766] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o
00:01:31.945 [199/766] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o
00:01:31.945 [200/766] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o
00:01:31.945 [201/766] Linking static target lib/librte_rcu.a
00:01:31.945 [202/766] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output)
00:01:31.945 [203/766] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o
00:01:31.945 [204/766] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o
00:01:31.945 [205/766] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o
00:01:31.945 [206/766] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o
00:01:31.945 [207/766] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:01:31.945 [208/766] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o
00:01:31.945 [209/766] Generating symbol file lib/librte_kvargs.so.25.0.p/librte_kvargs.so.25.0.symbols
00:01:31.945 [210/766] Linking static target lib/librte_dispatcher.a
00:01:31.945 [211/766] Linking static target lib/librte_latencystats.a
00:01:31.945 [212/766] Linking static target lib/librte_telemetry.a
00:01:31.945 [213/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o
00:01:31.945 [214/766] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o
00:01:31.945 [215/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o
00:01:31.945 [216/766] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o
00:01:31.945 [217/766] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o
00:01:31.945 [218/766] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o
00:01:31.945 [219/766] Linking static target lib/librte_eal.a
00:01:31.945 [220/766] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o
00:01:31.945 [221/766] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o
00:01:31.945 [222/766] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o
00:01:31.945 [223/766] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o
00:01:31.945 [224/766] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o
00:01:31.945 [225/766] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o
00:01:31.945 [226/766] Linking static target lib/librte_gpudev.a
00:01:31.945 [227/766] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o
00:01:31.945 [228/766] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o
00:01:31.945 [229/766] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o
00:01:31.945 [230/766] Linking static target lib/librte_gro.a
00:01:32.207 [231/766] Linking static target lib/librte_stack.a
00:01:32.207 [232/766] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o
00:01:32.207 [233/766] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o
00:01:32.207 [234/766] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o
00:01:32.207 [235/766] Linking static target lib/librte_power.a
00:01:32.207 [236/766] Linking static target lib/librte_regexdev.a
00:01:32.207 [237/766] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o
00:01:32.207 [238/766] Linking static target lib/librte_bbdev.a
00:01:32.207 [239/766] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o
00:01:32.207 [240/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o
00:01:32.207 [241/766] Linking static target lib/librte_distributor.a
00:01:32.207 [242/766] Linking static target lib/librte_gso.a
00:01:32.207 [243/766] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o
00:01:32.207 [244/766] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o
00:01:32.207 [245/766] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o
00:01:32.207 [246/766] Compiling C object lib/librte_table.a.p/table_table_log.c.o
00:01:32.207 [247/766] Linking static target lib/librte_rawdev.a
00:01:32.207 [248/766] Linking static target lib/librte_mbuf.a
00:01:32.207 [249/766] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o
00:01:32.207 [250/766] Linking static target lib/librte_dmadev.a
00:01:32.207 [251/766] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o
00:01:32.207 [252/766] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.208 [253/766] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.208 [254/766] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o
00:01:32.208 [255/766] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o
00:01:32.208 [256/766] Linking static target lib/librte_ip_frag.a
00:01:32.208 [257/766] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o
00:01:32.208 [258/766] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o
00:01:32.208 [259/766] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o
00:01:32.208 [260/766] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o
00:01:32.208 [261/766] Linking static target lib/librte_pcapng.a
00:01:32.208 [262/766] Linking static target lib/librte_mldev.a
00:01:32.473 [263/766] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o
00:01:32.473 [264/766] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o
00:01:32.473 [265/766] Linking static target lib/librte_reorder.a
00:01:32.473 [266/766] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o
00:01:32.473 [267/766] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o
00:01:32.473 [268/766] Compiling C object lib/librte_security.a.p/security_rte_security.c.o
00:01:32.473 [269/766] Linking static target lib/librte_security.a
00:01:32.473 [270/766] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o
00:01:32.473 [271/766] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.473 [272/766] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o
00:01:32.473 [273/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o
00:01:32.473 [274/766] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.473 [275/766] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o
00:01:32.473 [276/766] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o
00:01:32.473 [277/766] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.473 [278/766] Linking static target lib/librte_rib.a
00:01:32.473 [279/766] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o
00:01:32.473 [280/766] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.473 [281/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o
00:01:32.473 [282/766] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o
00:01:32.473 [283/766] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.473 [284/766] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o
00:01:32.473 [285/766] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o
00:01:32.473 [286/766] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o
00:01:32.473 [287/766] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o
00:01:32.473 [288/766] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.473 [289/766] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o
00:01:32.473 [290/766] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o
00:01:32.740 [291/766] Linking static target lib/librte_lpm.a
00:01:32.740 [292/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o
00:01:32.740 [293/766] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o
00:01:32.740 [294/766] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o
00:01:32.740 [295/766] Linking static target lib/librte_bpf.a
00:01:32.740 [296/766] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.740 [297/766] Compiling C object lib/librte_node.a.p/node_null.c.o
00:01:32.740 [298/766] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o
00:01:32.740 [299/766] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o
00:01:32.740 [300/766] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o
00:01:32.740 [301/766] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.740 [302/766] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o
00:01:32.740 [303/766] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o
00:01:32.740 [304/766] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.740 [305/766] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.740 [306/766] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.740 [307/766] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.740 [308/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o
00:01:32.740 [309/766] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o
00:01:32.740 [310/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:01:32.740 [311/766] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o
00:01:32.740 [312/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o
00:01:32.740 [313/766] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o
00:01:32.740 [314/766] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.740 [315/766] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o
00:01:32.740 [316/766] Linking target lib/librte_telemetry.so.25.0
00:01:32.740 [317/766] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:01:33.000 [318/766] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o
00:01:33.000 [319/766] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o
00:01:33.000 [320/766] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o
00:01:33.000 [321/766] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o
00:01:33.000 [322/766] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.000 [323/766] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o
00:01:33.000 [324/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o
00:01:33.000 [325/766] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o
00:01:33.000 [326/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o
00:01:33.000 [327/766] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o
00:01:33.000 [328/766] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o
00:01:33.000 [329/766] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o
00:01:33.000 [330/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o
00:01:33.000 [331/766] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o
00:01:33.000 [332/766] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o
00:01:33.000 [333/766] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o
00:01:33.000 [334/766] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o
00:01:33.000 [335/766] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.000 [336/766] Linking static target lib/librte_efd.a
00:01:33.000 [337/766] Generating symbol file lib/librte_telemetry.so.25.0.p/librte_telemetry.so.25.0.symbols
00:01:33.000 [338/766] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.000 [339/766] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o
00:01:33.000 [340/766] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o
00:01:33.000 [341/766] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o
00:01:33.000 [342/766] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o
00:01:33.259 [343/766] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o
00:01:33.259 [344/766] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.259 [345/766] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o
00:01:33.259 [346/766] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.259 [347/766] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.259 [348/766] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.259 [349/766] Compiling C object drivers/libtmp_rte_power_kvm_vm.a.p/power_kvm_vm_kvm_vm.c.o
00:01:33.259 [350/766] Compiling C object lib/librte_graph.a.p/graph_node.c.o
00:01:33.259 [351/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o
00:01:33.259 [352/766] Compiling C object drivers/libtmp_rte_power_kvm_vm.a.p/power_kvm_vm_guest_channel.c.o
00:01:33.259 [353/766] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o
00:01:33.259 [354/766] Linking static target drivers/libtmp_rte_power_kvm_vm.a
00:01:33.259 [355/766] Compiling C object lib/librte_fib.a.p/fib_trie.c.o
00:01:33.259 [356/766] Compiling C object lib/librte_node.a.p/node_log.c.o
00:01:33.259 [357/766] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.259 [358/766] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o
00:01:33.259 [359/766] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o
00:01:33.259 [360/766] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o
00:01:33.259 [361/766] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o
00:01:33.259 [362/766] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.259 [363/766] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o
00:01:33.259 [364/766] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o
00:01:33.259 [365/766] Generating app/graph/commands_hdr with a custom command (wrapped by meson to capture output)
00:01:33.525 [366/766] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o
00:01:33.525 [367/766] Compiling C object lib/librte_graph.a.p/graph_graph.c.o
00:01:33.525 [368/766] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o
00:01:33.525 [369/766] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o
00:01:33.525 [370/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o
00:01:33.525 [371/766] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o
00:01:33.525 [372/766] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o
00:01:33.525 [373/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o
00:01:33.525 [374/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o
00:01:33.525 [375/766] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o
00:01:33.525 [376/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o
00:01:33.525 [377/766] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.525 [378/766] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.525 [379/766] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o
00:01:33.525 [380/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o
00:01:33.525 [381/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o
00:01:33.525 [382/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o
00:01:33.525 [383/766] Generating drivers/rte_power_kvm_vm.pmd.c with a custom command
00:01:33.525 [384/766] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o
00:01:33.525 [385/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o
00:01:33.526 [386/766] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o
00:01:33.526 [387/766] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o
00:01:33.526 [388/766] Compiling C object drivers/librte_power_kvm_vm.a.p/meson-generated_.._rte_power_kvm_vm.pmd.c.o
00:01:33.526 [389/766] Linking static target lib/librte_pdump.a
00:01:33.526 [390/766] Compiling C object drivers/librte_power_kvm_vm.so.25.0.p/meson-generated_.._rte_power_kvm_vm.pmd.c.o
00:01:33.526 [391/766] Linking static target drivers/librte_power_kvm_vm.a
00:01:33.526 [392/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o
00:01:33.526 [393/766] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o
00:01:33.526 [394/766] Linking static target drivers/libtmp_rte_bus_vdev.a
00:01:33.783 [395/766] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.783 [396/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o
00:01:33.783 [397/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o
00:01:33.783 [398/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o
00:01:33.783 [399/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o
00:01:33.783 [400/766] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.783 [401/766] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o
00:01:33.783 [402/766] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o
00:01:33.783 [403/766] Linking static target lib/librte_graph.a
00:01:33.783 [404/766] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o
00:01:33.783 [405/766] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o
00:01:33.783 [406/766] Compiling C object drivers/libtmp_rte_power_acpi.a.p/power_acpi_acpi_cpufreq.c.o
00:01:33.783 [407/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o
00:01:33.783 [408/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o
00:01:33.783 [409/766] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o
00:01:34.049 [410/766] Linking static target drivers/libtmp_rte_power_acpi.a
00:01:34.049 [411/766] Compiling C object drivers/libtmp_rte_power_intel_uncore.a.p/power_intel_uncore_intel_uncore.c.o
00:01:34.049 [412/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o
00:01:34.049 [413/766] Linking static target drivers/libtmp_rte_power_intel_uncore.a
00:01:34.049 [414/766] Linking static target drivers/libtmp_rte_bus_pci.a
00:01:34.049 [415/766] Compiling C object drivers/libtmp_rte_power_amd_pstate.a.p/power_amd_pstate_amd_pstate_cpufreq.c.o
00:01:34.049 [416/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o
00:01:34.049 [417/766] Linking static target drivers/libtmp_rte_power_amd_pstate.a
00:01:34.049 [418/766] Generating drivers/rte_power_kvm_vm.sym_chk with a custom command (wrapped by meson to capture output)
00:01:34.049 [419/766] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o
00:01:34.049 [420/766] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o
00:01:34.049 [421/766] Compiling C object drivers/libtmp_rte_power_cppc.a.p/power_cppc_cppc_cpufreq.c.o
00:01:34.049 [422/766] Generating drivers/rte_bus_vdev.pmd.c with a custom command
00:01:34.049 [423/766] Linking static target lib/librte_sched.a
00:01:34.049 [424/766] Linking static target drivers/libtmp_rte_power_cppc.a
00:01:34.049 [425/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o
00:01:34.049 [426/766] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o
00:01:34.049 [427/766] Linking static target lib/librte_table.a
00:01:34.049 [428/766] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o
00:01:34.049 [429/766] Compiling C object drivers/librte_bus_vdev.so.25.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o
00:01:34.049 [430/766] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o
00:01:34.049 [431/766] Linking static target drivers/librte_bus_vdev.a
00:01:34.049 [432/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o
00:01:34.049 [433/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o
00:01:34.049 [434/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o
00:01:34.049 [435/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o
00:01:34.049 [436/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o
00:01:34.049 [437/766] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output)
00:01:34.049 [438/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o
00:01:34.049 [439/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o
00:01:34.049 [440/766] Compiling C object drivers/libtmp_rte_power_intel_pstate.a.p/power_intel_pstate_intel_pstate_cpufreq.c.o
00:01:34.049 [441/766] Compiling C object app/dpdk-graph.p/graph_cli.c.o
00:01:34.049 [442/766] Linking static target drivers/libtmp_rte_power_intel_pstate.a
00:01:34.049 [443/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o
00:01:34.049 [444/766] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o
00:01:34.049 [445/766] Linking static target lib/librte_fib.a
00:01:34.308 [446/766] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o
00:01:34.308 [447/766] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o
00:01:34.308 [448/766] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o
00:01:34.308 [449/766] Compiling C object app/dpdk-graph.p/graph_conn.c.o
00:01:34.308 [450/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o
00:01:34.308 [451/766] Compiling C object app/dpdk-graph.p/graph_mempool.c.o
00:01:34.308 [452/766] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o
00:01:34.308 [453/766] Generating drivers/rte_power_acpi.pmd.c with a custom command
00:01:34.308 [454/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o
00:01:34.308 [455/766] Compiling C object app/dpdk-graph.p/graph_l2fwd.c.o
00:01:34.308 [456/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o
00:01:34.308 [457/766] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o
00:01:34.308 [458/766] Compiling C object drivers/librte_power_acpi.a.p/meson-generated_.._rte_power_acpi.pmd.c.o
00:01:34.308 [459/766] Compiling C object drivers/librte_power_acpi.so.25.0.p/meson-generated_.._rte_power_acpi.pmd.c.o
00:01:34.308 [460/766] Generating drivers/rte_power_intel_uncore.pmd.c with a custom command
00:01:34.308 [461/766] Linking static target drivers/librte_power_acpi.a
00:01:34.308 [462/766] Compiling C object app/dpdk-graph.p/graph_main.c.o
00:01:34.308 [463/766] Compiling C object app/dpdk-graph.p/graph_utils.c.o
00:01:34.308 [464/766] Generating drivers/rte_power_amd_pstate.pmd.c with a custom command
00:01:34.308 [465/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o
00:01:34.308 [466/766] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o
00:01:34.308 [467/766] Compiling C object drivers/librte_power_intel_uncore.so.25.0.p/meson-generated_.._rte_power_intel_uncore.pmd.c.o
00:01:34.308 [468/766] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o
00:01:34.308 [469/766] Compiling C object app/dpdk-graph.p/graph_graph.c.o
00:01:34.308 [470/766] Compiling C object drivers/librte_power_intel_uncore.a.p/meson-generated_.._rte_power_intel_uncore.pmd.c.o
00:01:34.308 [471/766] Generating drivers/rte_bus_pci.pmd.c with a custom command
00:01:34.308 [472/766] Compiling C object drivers/librte_power_amd_pstate.a.p/meson-generated_.._rte_power_amd_pstate.pmd.c.o
00:01:34.308 [473/766] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o
00:01:34.308 [474/766] Linking static target drivers/librte_power_intel_uncore.a
00:01:34.308 [475/766] Compiling C object drivers/librte_power_amd_pstate.so.25.0.p/meson-generated_.._rte_power_amd_pstate.pmd.c.o
00:01:34.308 [476/766] Linking static target lib/librte_cryptodev.a
00:01:34.308 [477/766] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o
00:01:34.308 [478/766] Linking static target drivers/librte_power_amd_pstate.a
00:01:34.308 [479/766] Compiling C object drivers/librte_bus_pci.so.25.0.p/meson-generated_.._rte_bus_pci.pmd.c.o
00:01:34.308 [480/766] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o
00:01:34.308 [481/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o
00:01:34.308 [482/766] Linking static target drivers/librte_bus_pci.a
00:01:34.308 [483/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o
00:01:34.308 [484/766] Linking static target lib/librte_member.a
00:01:34.568 [485/766] Generating drivers/rte_power_cppc.pmd.c with a custom command
00:01:34.568 [486/766] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o
00:01:34.568 [487/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o
00:01:34.568 [488/766] Compiling C object app/dpdk-graph.p/graph_neigh.c.o
00:01:34.568 [489/766] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o
00:01:34.568 [490/766] Compiling C object drivers/librte_power_cppc.a.p/meson-generated_.._rte_power_cppc.pmd.c.o
00:01:34.568 [491/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o
00:01:34.568 [492/766] Compiling C object drivers/librte_power_cppc.so.25.0.p/meson-generated_.._rte_power_cppc.pmd.c.o
00:01:34.568 [493/766] Linking static target drivers/libtmp_rte_mempool_ring.a
00:01:34.568 [494/766] Linking static target drivers/librte_power_cppc.a
00:01:34.568 [495/766] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o
00:01:34.568 [496/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o
00:01:34.568 [497/766] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:34.568 [498/766] Generating drivers/rte_power_intel_pstate.pmd.c with a custom command
00:01:34.568 [499/766] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o
00:01:34.568 [500/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o
00:01:34.568 [501/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o
00:01:34.569 [502/766] Compiling C object drivers/librte_power_intel_pstate.a.p/meson-generated_.._rte_power_intel_pstate.pmd.c.o
00:01:34.569 [503/766] Compiling C object drivers/librte_power_intel_pstate.so.25.0.p/meson-generated_.._rte_power_intel_pstate.pmd.c.o
00:01:34.569 [504/766] Linking static target drivers/librte_power_intel_pstate.a
00:01:34.569 [505/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o
00:01:34.569 [506/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o
00:01:34.569 [507/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o
00:01:34.569 [508/766] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o
00:01:34.569 [509/766] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o
00:01:34.569 [510/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o
00:01:34.569 [511/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o
00:01:34.569 [512/766] Linking static target lib/librte_node.a
00:01:34.569 [513/766] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o
00:01:34.569 [514/766] Linking static target lib/librte_pdcp.a
00:01:34.569 [515/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o
00:01:34.828 [516/766] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o
00:01:34.828 [517/766] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o
00:01:34.828 [518/766] Linking static target lib/librte_ipsec.a
00:01:34.828 [519/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o
00:01:34.828 [520/766] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output)
00:01:34.828 [521/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o
00:01:34.828 [522/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o
00:01:34.828 [523/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o
00:01:34.828 [524/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o
00:01:34.828 [525/766] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o
00:01:34.828 [526/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o
00:01:34.828 [527/766] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o
00:01:34.828 [528/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o
00:01:34.828 [529/766] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o
00:01:34.828 [530/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o
00:01:34.828 [531/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o
00:01:34.828 [532/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o
00:01:34.828 [533/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o
00:01:34.828 [534/766] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output)
00:01:34.828 [535/766] Linking static target lib/librte_hash.a
00:01:34.828 [536/766] Generating drivers/rte_mempool_ring.pmd.c with a custom command
00:01:34.828 [537/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o
00:01:34.828 [538/766] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o
00:01:34.828 [539/766] Compiling C object drivers/librte_mempool_ring.so.25.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o
00:01:34.828 [540/766] Linking static target drivers/librte_mempool_ring.a
00:01:34.828 [541/766] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o
00:01:34.828 [542/766] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o
00:01:34.828 [543/766] Linking static target lib/librte_port.a
00:01:34.828 [544/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o
00:01:34.828 [545/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o
00:01:34.828 [546/766] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output)
00:01:34.828 [547/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o
00:01:34.828 [548/766] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output)
00:01:34.828 [549/766] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:35.087 [550/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o
00:01:35.087 [551/766] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o
00:01:35.087 [552/766] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o
00:01:35.087 [553/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o
00:01:35.087 [554/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o
00:01:35.087 [555/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o
00:01:35.087 [556/766] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o
00:01:35.087 [557/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o
00:01:35.087 [558/766] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o
00:01:35.087 [559/766] Linking static target lib/acl/libavx2_tmp.a
00:01:35.087 [560/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o
00:01:35.087 [561/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o
00:01:35.087 [562/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o
00:01:35.087 [563/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o
00:01:35.087 [564/766] Compiling C object app/dpdk-pdump.p/pdump_main.c.o
00:01:35.087 [565/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o
00:01:35.087 [566/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o
00:01:35.087 [567/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o
00:01:35.087 [568/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o
00:01:35.087 [569/766] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output)
00:01:35.087 [570/766] Linking static target lib/librte_eventdev.a
00:01:35.087 [571/766] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output)
00:01:35.087 [572/766] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output)
00:01:35.087 [573/766] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output)
00:01:35.087 [574/766] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o
00:01:35.087 [575/766] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output)
00:01:35.087 [576/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o
00:01:35.087 [577/766] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o
00:01:35.087 [578/766] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o
00:01:35.087 [579/766] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o
00:01:35.087 [580/766] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o
00:01:35.346 [581/766] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o
00:01:35.346 [582/766] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o
00:01:35.346 [583/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o
00:01:35.346 [584/766] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o
00:01:35.346 [585/766] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o
00:01:35.346 [586/766] Compiling C object app/dpdk-testpmd.p/test-pmd_hairpin.c.o
00:01:35.346 [587/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o
00:01:35.346 [588/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o
00:01:35.346 [589/766] Compiling C object app/dpdk-test-security-perf.p/test_test_security_proto.c.o
00:01:35.346 [590/766] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o
00:01:35.346 [591/766] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o
00:01:35.346 [592/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o
00:01:35.346 [593/766] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o
00:01:35.346 [594/766] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o
00:01:35.346 [595/766] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o
00:01:35.346 [596/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o
00:01:35.346 [597/766] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o
00:01:35.605 [598/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o
00:01:35.605 [599/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o
00:01:35.605 [600/766] Linking static target drivers/net/i40e/base/libi40e_base.a
00:01:35.605 [601/766] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o
00:01:35.605 [602/766] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o
00:01:35.605 [603/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o
00:01:35.605 [604/766] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o
00:01:35.605 [605/766] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o
00:01:35.605 [606/766] Linking static target drivers/net/i40e/libi40e_avx2_lib.a
00:01:35.605 [607/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o
00:01:35.605 [608/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o
00:01:35.605 [609/766] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o
00:01:35.605 [610/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o
00:01:35.605 [611/766] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o
00:01:35.605 [612/766] Linking static target lib/librte_acl.a
00:01:35.605 [613/766] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output)
00:01:35.605 [614/766] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o
00:01:35.605 [615/766] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o
00:01:35.605 [616/766] Linking static target drivers/net/i40e/libi40e_avx512_lib.a
00:01:35.605 [617/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o
00:01:35.862 [618/766] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output)
00:01:35.862 [619/766] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o
00:01:35.862 [620/766] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o
00:01:36.120 [621/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o
00:01:36.120 [622/766] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o
00:01:36.120 [623/766] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output)
00:01:36.120 [624/766] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o
00:01:36.120 [625/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o
00:01:36.378 [626/766] Linking static target lib/librte_ethdev.a
00:01:36.378 [627/766] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:36.636 [628/766] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o
00:01:36.636 [629/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o
00:01:36.894 [630/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o
00:01:37.153 [631/766] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o
00:01:37.153 [632/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o
00:01:37.717 [633/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o
00:01:37.717 [634/766] Linking static target drivers/libtmp_rte_net_i40e.a
00:01:37.717 [635/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o
00:01:37.976 [636/766] Generating drivers/rte_net_i40e.pmd.c with a custom command
00:01:37.976 [637/766] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:01:37.976 [638/766] Compiling C object drivers/librte_net_i40e.so.25.0.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:01:37.976 [639/766] Linking static target drivers/librte_net_i40e.a
00:01:38.545 [640/766] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o
00:01:38.804 [641/766] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output)
00:01:39.062 [642/766] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output)
00:01:39.062 [643/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o
00:01:39.321 [644/766] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o
00:01:44.595 [645/766] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output)
00:01:44.595 [646/766] Linking target lib/librte_eal.so.25.0
00:01:44.595 [647/766] Generating symbol file lib/librte_eal.so.25.0.p/librte_eal.so.25.0.symbols
00:01:44.595 [648/766] Linking target lib/librte_stack.so.25.0
00:01:44.595 [649/766] Linking target lib/librte_cfgfile.so.25.0
00:01:44.595 [650/766] Linking target lib/librte_meter.so.25.0
00:01:44.595 [651/766] Linking target lib/librte_timer.so.25.0
00:01:44.595 [652/766] Linking target lib/librte_ring.so.25.0
00:01:44.595 [653/766] Linking target lib/librte_dmadev.so.25.0
00:01:44.595 [654/766] Linking target lib/librte_pci.so.25.0
00:01:44.595 [655/766] Linking target lib/librte_jobstats.so.25.0
00:01:44.595 [656/766] Linking target lib/librte_acl.so.25.0
00:01:44.595 [657/766] Linking target lib/librte_rawdev.so.25.0
00:01:44.595 [658/766] Linking target drivers/librte_bus_vdev.so.25.0
00:01:44.855 [659/766] Generating symbol file lib/librte_meter.so.25.0.p/librte_meter.so.25.0.symbols
00:01:44.855 [660/766] Generating symbol file lib/librte_ring.so.25.0.p/librte_ring.so.25.0.symbols
00:01:44.855 [661/766] Generating symbol file lib/librte_timer.so.25.0.p/librte_timer.so.25.0.symbols
00:01:44.855 [662/766] Generating symbol file drivers/librte_bus_vdev.so.25.0.p/librte_bus_vdev.so.25.0.symbols
00:01:44.855 [663/766] Generating symbol file lib/librte_acl.so.25.0.p/librte_acl.so.25.0.symbols
00:01:44.855 [664/766] Generating symbol file lib/librte_dmadev.so.25.0.p/librte_dmadev.so.25.0.symbols
00:01:44.855 [665/766] Generating symbol file lib/librte_pci.so.25.0.p/librte_pci.so.25.0.symbols
00:01:44.855 [666/766] Linking target lib/librte_rcu.so.25.0
00:01:44.855 [667/766] Linking target drivers/librte_bus_pci.so.25.0
00:01:44.855 [668/766] Linking target lib/librte_mempool.so.25.0
00:01:45.113 [669/766] Generating symbol file
drivers/librte_bus_pci.so.25.0.p/librte_bus_pci.so.25.0.symbols 00:01:45.113 [670/766] Generating symbol file lib/librte_rcu.so.25.0.p/librte_rcu.so.25.0.symbols 00:01:45.113 [671/766] Generating symbol file lib/librte_mempool.so.25.0.p/librte_mempool.so.25.0.symbols 00:01:45.113 [672/766] Linking target drivers/librte_mempool_ring.so.25.0 00:01:45.113 [673/766] Linking target lib/librte_mbuf.so.25.0 00:01:45.113 [674/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:01:45.113 [675/766] Linking static target lib/librte_pipeline.a 00:01:45.113 [676/766] Generating symbol file lib/librte_mbuf.so.25.0.p/librte_mbuf.so.25.0.symbols 00:01:45.113 [677/766] Linking target lib/librte_reorder.so.25.0 00:01:45.113 [678/766] Linking target lib/librte_cryptodev.so.25.0 00:01:45.113 [679/766] Linking target lib/librte_distributor.so.25.0 00:01:45.113 [680/766] Linking target lib/librte_gpudev.so.25.0 00:01:45.113 [681/766] Linking target lib/librte_compressdev.so.25.0 00:01:45.113 [682/766] Linking target lib/librte_bbdev.so.25.0 00:01:45.113 [683/766] Linking target lib/librte_net.so.25.0 00:01:45.113 [684/766] Linking target lib/librte_regexdev.so.25.0 00:01:45.113 [685/766] Linking target lib/librte_mldev.so.25.0 00:01:45.113 [686/766] Linking target lib/librte_sched.so.25.0 00:01:45.372 [687/766] Generating symbol file lib/librte_reorder.so.25.0.p/librte_reorder.so.25.0.symbols 00:01:45.372 [688/766] Generating symbol file lib/librte_cryptodev.so.25.0.p/librte_cryptodev.so.25.0.symbols 00:01:45.372 [689/766] Generating symbol file lib/librte_sched.so.25.0.p/librte_sched.so.25.0.symbols 00:01:45.372 [690/766] Generating symbol file lib/librte_net.so.25.0.p/librte_net.so.25.0.symbols 00:01:45.372 [691/766] Linking target lib/librte_rib.so.25.0 00:01:45.372 [692/766] Linking target lib/librte_security.so.25.0 00:01:45.372 [693/766] Linking target lib/librte_cmdline.so.25.0 00:01:45.372 [694/766] Linking target lib/librte_hash.so.25.0 00:01:45.654 [695/766] Generating symbol file lib/librte_hash.so.25.0.p/librte_hash.so.25.0.symbols 00:01:45.654 [696/766] Generating symbol file lib/librte_rib.so.25.0.p/librte_rib.so.25.0.symbols 00:01:45.654 [697/766] Generating symbol file lib/librte_security.so.25.0.p/librte_security.so.25.0.symbols 00:01:45.654 [698/766] Linking target lib/librte_lpm.so.25.0 00:01:45.654 [699/766] Linking target lib/librte_efd.so.25.0 00:01:45.654 [700/766] Linking target lib/librte_member.so.25.0 00:01:45.654 [701/766] Linking target lib/librte_fib.so.25.0 00:01:45.655 [702/766] Linking target lib/librte_ipsec.so.25.0 00:01:45.655 [703/766] Linking target lib/librte_pdcp.so.25.0 00:01:45.655 [704/766] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.655 [705/766] Generating symbol file lib/librte_lpm.so.25.0.p/librte_lpm.so.25.0.symbols 00:01:45.655 [706/766] Generating symbol file lib/librte_ipsec.so.25.0.p/librte_ipsec.so.25.0.symbols 00:01:45.912 [707/766] Linking target lib/librte_ethdev.so.25.0 00:01:45.912 [708/766] Generating symbol file lib/librte_ethdev.so.25.0.p/librte_ethdev.so.25.0.symbols 00:01:45.912 [709/766] Linking target lib/librte_gso.so.25.0 00:01:45.912 [710/766] Linking target lib/librte_gro.so.25.0 00:01:45.912 [711/766] Linking target lib/librte_metrics.so.25.0 00:01:45.912 [712/766] Linking target lib/librte_pcapng.so.25.0 00:01:45.912 [713/766] Linking target lib/librte_ip_frag.so.25.0 00:01:45.912 [714/766] Linking target lib/librte_power.so.25.0 00:01:45.912 [715/766] Linking 
target lib/librte_bpf.so.25.0 00:01:45.912 [716/766] Linking target lib/librte_eventdev.so.25.0 00:01:46.172 [717/766] Linking target drivers/librte_net_i40e.so.25.0 00:01:46.172 [718/766] Generating symbol file lib/librte_eventdev.so.25.0.p/librte_eventdev.so.25.0.symbols 00:01:46.172 [719/766] Generating symbol file lib/librte_metrics.so.25.0.p/librte_metrics.so.25.0.symbols 00:01:46.172 [720/766] Generating symbol file lib/librte_ip_frag.so.25.0.p/librte_ip_frag.so.25.0.symbols 00:01:46.172 [721/766] Generating symbol file lib/librte_pcapng.so.25.0.p/librte_pcapng.so.25.0.symbols 00:01:46.172 [722/766] Generating symbol file lib/librte_bpf.so.25.0.p/librte_bpf.so.25.0.symbols 00:01:46.172 [723/766] Generating symbol file lib/librte_power.so.25.0.p/librte_power.so.25.0.symbols 00:01:46.172 [724/766] Linking target lib/librte_bitratestats.so.25.0 00:01:46.172 [725/766] Linking target lib/librte_latencystats.so.25.0 00:01:46.172 [726/766] Linking target lib/librte_dispatcher.so.25.0 00:01:46.172 [727/766] Linking target lib/librte_pdump.so.25.0 00:01:46.172 [728/766] Linking target lib/librte_graph.so.25.0 00:01:46.172 [729/766] Linking target lib/librte_port.so.25.0 00:01:46.172 [730/766] Linking target drivers/librte_power_amd_pstate.so.25.0 00:01:46.172 [731/766] Linking target drivers/librte_power_intel_pstate.so.25.0 00:01:46.172 [732/766] Linking target drivers/librte_power_acpi.so.25.0 00:01:46.172 [733/766] Linking target drivers/librte_power_cppc.so.25.0 00:01:46.172 [734/766] Linking target drivers/librte_power_kvm_vm.so.25.0 00:01:46.172 [735/766] Linking target drivers/librte_power_intel_uncore.so.25.0 00:01:46.432 [736/766] Generating symbol file lib/librte_graph.so.25.0.p/librte_graph.so.25.0.symbols 00:01:46.432 [737/766] Generating symbol file lib/librte_port.so.25.0.p/librte_port.so.25.0.symbols 00:01:46.432 [738/766] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:46.432 [739/766] Linking target lib/librte_node.so.25.0 00:01:46.432 [740/766] Linking target lib/librte_table.so.25.0 00:01:46.432 [741/766] Linking static target lib/librte_vhost.a 00:01:46.432 [742/766] Generating symbol file lib/librte_table.so.25.0.p/librte_table.so.25.0.symbols 00:01:47.000 [743/766] Linking target app/dpdk-pdump 00:01:47.000 [744/766] Linking target app/dpdk-dumpcap 00:01:47.000 [745/766] Linking target app/dpdk-test-fib 00:01:47.000 [746/766] Linking target app/dpdk-test-crypto-perf 00:01:47.000 [747/766] Linking target app/dpdk-test-security-perf 00:01:47.000 [748/766] Linking target app/dpdk-test-regex 00:01:47.000 [749/766] Linking target app/dpdk-test-dma-perf 00:01:47.000 [750/766] Linking target app/dpdk-test-bbdev 00:01:47.000 [751/766] Linking target app/dpdk-test-cmdline 00:01:47.000 [752/766] Linking target app/dpdk-test-sad 00:01:47.000 [753/766] Linking target app/dpdk-test-mldev 00:01:47.000 [754/766] Linking target app/dpdk-test-gpudev 00:01:47.000 [755/766] Linking target app/dpdk-test-compress-perf 00:01:47.000 [756/766] Linking target app/dpdk-test-acl 00:01:47.000 [757/766] Linking target app/dpdk-proc-info 00:01:47.000 [758/766] Linking target app/dpdk-test-flow-perf 00:01:47.000 [759/766] Linking target app/dpdk-test-pipeline 00:01:47.000 [760/766] Linking target app/dpdk-test-eventdev 00:01:47.000 [761/766] Linking target app/dpdk-graph 00:01:47.000 [762/766] Linking target app/dpdk-testpmd 00:01:48.918 [763/766] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.918 [764/766] Linking target 
lib/librte_vhost.so.25.0 00:01:50.926 [765/766] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.926 [766/766] Linking target lib/librte_pipeline.so.25.0 00:01:50.926 15:38:58 build_native_dpdk -- common/autobuild_common.sh@201 -- $ uname -s 00:01:50.926 15:38:58 build_native_dpdk -- common/autobuild_common.sh@201 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:01:50.926 15:38:58 build_native_dpdk -- common/autobuild_common.sh@214 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:01:50.926 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:50.926 [0/1] Installing files. 00:01:51.214 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints/memory.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints/cpu.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints/counters.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:01:51.214 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:51.214 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipv6_addr_swap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipv6_addr_swap.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.214 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.215 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_eddsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_skeleton.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_gre.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_mpls.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_ipv4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_ipv4.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:51.215 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.215 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:51.216 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:51.216 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:51.216 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:01:51.216 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.217 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.218 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:51.219 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 
00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:51.219 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:51.220 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:51.220 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:51.220 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:01:51.220 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:01:51.220 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:51.220 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:51.220 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:51.220 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:51.220 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_log.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_kvargs.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_argparse.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_argparse.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_telemetry.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_eal.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing 
lib/librte_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_rcu.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_mempool.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_mbuf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_net.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_meter.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_ethdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_pci.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_cmdline.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_metrics.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_hash.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_timer.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_acl.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_bbdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.220 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_bitratestats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_bpf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 
Installing lib/librte_cfgfile.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_compressdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_cryptodev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_distributor.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_dmadev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_efd.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_eventdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_dispatcher.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_gpudev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_gro.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_gso.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_ip_frag.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_jobstats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_latencystats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_lpm.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_member.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_pcapng.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_pcapng.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_power.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_rawdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_regexdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_mldev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_rib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_reorder.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_sched.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_security.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_stack.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_vhost.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_ipsec.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_pdcp.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_fib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_port.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_pdump.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing 
lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_table.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_pipeline.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_graph.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing lib/librte_node.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing drivers/librte_bus_pci.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:01:51.479 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing drivers/librte_bus_vdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:01:51.479 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.479 Installing drivers/librte_mempool_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:01:51.740 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.740 Installing drivers/librte_net_i40e.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:01:51.740 Installing drivers/librte_power_acpi.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.740 Installing drivers/librte_power_acpi.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:01:51.740 Installing drivers/librte_power_amd_pstate.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.740 Installing drivers/librte_power_amd_pstate.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:01:51.740 Installing drivers/librte_power_cppc.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.740 Installing drivers/librte_power_cppc.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:01:51.740 Installing drivers/librte_power_intel_pstate.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.740 Installing drivers/librte_power_intel_pstate.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:01:51.740 Installing drivers/librte_power_intel_uncore.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.740 Installing drivers/librte_power_intel_uncore.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:01:51.740 Installing drivers/librte_power_kvm_vm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:51.740 Installing drivers/librte_power_kvm_vm.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:01:51.740 Installing app/dpdk-dumpcap to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/argparse/rte_argparse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitset.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore_var.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ptr_compress/rte_ptr_compress.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_cksum.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip4.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/power_cpufreq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/power_uncore_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_cpufreq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_qos.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.742 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/power/kvm_vm/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry-exporter.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:51.743 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:51.743 Installing symlink pointing to librte_log.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.25 00:01:51.743 Installing symlink pointing to librte_log.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:01:51.743 Installing symlink pointing to librte_kvargs.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.25 00:01:51.743 Installing symlink pointing to librte_kvargs.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:01:51.743 Installing symlink pointing to librte_argparse.so.25.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_argparse.so.25 00:01:51.743 Installing symlink pointing to librte_argparse.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_argparse.so 00:01:51.743 Installing symlink pointing to librte_telemetry.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.25 00:01:51.743 Installing symlink pointing to librte_telemetry.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:01:51.743 Installing symlink pointing to librte_eal.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.25 00:01:51.743 Installing symlink pointing to librte_eal.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:01:51.743 Installing symlink pointing to librte_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.25 00:01:51.743 Installing symlink pointing to librte_ring.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:01:51.743 Installing symlink pointing to librte_rcu.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.25 00:01:51.743 Installing symlink pointing to librte_rcu.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:01:51.743 Installing symlink pointing to librte_mempool.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.25 00:01:51.743 Installing symlink pointing to librte_mempool.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:01:51.743 Installing symlink pointing to librte_mbuf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.25 00:01:51.743 Installing symlink pointing to librte_mbuf.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:01:51.743 Installing symlink pointing to librte_net.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.25 00:01:51.743 Installing symlink pointing to librte_net.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:01:51.743 Installing symlink pointing to librte_meter.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.25 00:01:51.743 Installing symlink pointing to librte_meter.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:01:51.743 Installing symlink pointing to librte_ethdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.25 00:01:51.743 Installing symlink pointing to librte_ethdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:01:51.743 Installing symlink pointing to librte_pci.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.25 00:01:51.743 Installing symlink pointing to librte_pci.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:01:51.743 Installing symlink pointing to librte_cmdline.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.25 00:01:51.743 Installing symlink pointing to librte_cmdline.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:01:51.743 Installing symlink pointing to librte_metrics.so.25.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.25 00:01:51.743 Installing symlink pointing to librte_metrics.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:01:51.743 Installing symlink pointing to librte_hash.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.25 00:01:51.743 Installing symlink pointing to librte_hash.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:01:51.743 Installing symlink pointing to librte_timer.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.25 00:01:51.743 Installing symlink pointing to librte_timer.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:01:51.743 Installing symlink pointing to librte_acl.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.25 00:01:51.743 Installing symlink pointing to librte_acl.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:01:51.743 Installing symlink pointing to librte_bbdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.25 00:01:51.743 Installing symlink pointing to librte_bbdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:01:51.743 Installing symlink pointing to librte_bitratestats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.25 00:01:51.743 Installing symlink pointing to librte_bitratestats.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:01:51.743 Installing symlink pointing to librte_bpf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.25 00:01:51.743 Installing symlink pointing to librte_bpf.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:01:51.743 Installing symlink pointing to librte_cfgfile.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.25 00:01:51.743 Installing symlink pointing to librte_cfgfile.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:01:51.743 Installing symlink pointing to librte_compressdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.25 00:01:51.743 Installing symlink pointing to librte_compressdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:01:51.743 Installing symlink pointing to librte_cryptodev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.25 00:01:51.743 Installing symlink pointing to librte_cryptodev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:01:51.743 Installing symlink pointing to librte_distributor.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.25 00:01:51.743 Installing symlink pointing to librte_distributor.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:01:51.743 Installing symlink pointing to librte_dmadev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.25 00:01:51.743 Installing symlink pointing to librte_dmadev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:01:51.743 
Installing symlink pointing to librte_efd.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.25 00:01:51.743 Installing symlink pointing to librte_efd.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:01:51.743 Installing symlink pointing to librte_eventdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.25 00:01:51.743 Installing symlink pointing to librte_eventdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:01:51.743 Installing symlink pointing to librte_dispatcher.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.25 00:01:51.743 Installing symlink pointing to librte_dispatcher.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:01:51.743 Installing symlink pointing to librte_gpudev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.25 00:01:51.743 Installing symlink pointing to librte_gpudev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:01:51.743 Installing symlink pointing to librte_gro.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.25 00:01:51.743 Installing symlink pointing to librte_gro.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:01:51.743 Installing symlink pointing to librte_gso.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.25 00:01:51.743 Installing symlink pointing to librte_gso.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:01:51.743 Installing symlink pointing to librte_ip_frag.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.25 00:01:51.743 Installing symlink pointing to librte_ip_frag.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:01:51.743 Installing symlink pointing to librte_jobstats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.25 00:01:51.743 Installing symlink pointing to librte_jobstats.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:01:51.744 Installing symlink pointing to librte_latencystats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.25 00:01:51.744 Installing symlink pointing to librte_latencystats.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:01:51.744 Installing symlink pointing to librte_lpm.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.25 00:01:51.744 Installing symlink pointing to librte_lpm.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:01:51.744 Installing symlink pointing to librte_member.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.25 00:01:51.744 Installing symlink pointing to librte_member.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:01:51.744 Installing symlink pointing to librte_pcapng.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.25 00:01:51.744 Installing symlink pointing to librte_pcapng.so.25 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:01:51.744 Installing symlink pointing to librte_power.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.25 00:01:51.744 Installing symlink pointing to librte_power.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:01:51.744 Installing symlink pointing to librte_rawdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.25 00:01:51.744 Installing symlink pointing to librte_rawdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:01:51.744 Installing symlink pointing to librte_regexdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.25 00:01:51.744 Installing symlink pointing to librte_regexdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:01:51.744 Installing symlink pointing to librte_mldev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.25 00:01:51.744 Installing symlink pointing to librte_mldev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:01:51.744 Installing symlink pointing to librte_rib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.25 00:01:51.744 Installing symlink pointing to librte_rib.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:01:51.744 Installing symlink pointing to librte_reorder.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.25 00:01:51.744 Installing symlink pointing to librte_reorder.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:01:51.744 Installing symlink pointing to librte_sched.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.25 00:01:51.744 Installing symlink pointing to librte_sched.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:01:51.744 Installing symlink pointing to librte_security.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.25 00:01:51.744 Installing symlink pointing to librte_security.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:01:51.744 Installing symlink pointing to librte_stack.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.25 00:01:51.744 Installing symlink pointing to librte_stack.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:01:51.744 Installing symlink pointing to librte_vhost.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.25 00:01:51.744 Installing symlink pointing to librte_vhost.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:01:51.744 Installing symlink pointing to librte_ipsec.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.25 00:01:51.744 Installing symlink pointing to librte_ipsec.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:01:51.744 Installing symlink pointing to librte_pdcp.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.25 00:01:51.744 Installing symlink pointing to librte_pdcp.so.25 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:01:51.744 Installing symlink pointing to librte_fib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.25 00:01:51.744 Installing symlink pointing to librte_fib.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:01:51.744 Installing symlink pointing to librte_port.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.25 00:01:51.744 Installing symlink pointing to librte_port.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:01:51.744 Installing symlink pointing to librte_pdump.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.25 00:01:51.744 Installing symlink pointing to librte_pdump.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:01:51.744 Installing symlink pointing to librte_table.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.25 00:01:51.744 Installing symlink pointing to librte_table.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:01:51.744 Installing symlink pointing to librte_pipeline.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.25 00:01:51.744 Installing symlink pointing to librte_pipeline.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:01:51.744 Installing symlink pointing to librte_graph.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.25 00:01:51.744 Installing symlink pointing to librte_graph.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:01:51.744 Installing symlink pointing to librte_node.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.25 00:01:51.744 Installing symlink pointing to librte_node.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:01:51.744 Installing symlink pointing to librte_bus_pci.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_pci.so.25 00:01:51.744 Installing symlink pointing to librte_bus_pci.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_pci.so 00:01:51.744 Installing symlink pointing to librte_bus_vdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_vdev.so.25 00:01:51.744 Installing symlink pointing to librte_bus_vdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_vdev.so 00:01:51.744 Installing symlink pointing to librte_mempool_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_mempool_ring.so.25 00:01:51.744 Installing symlink pointing to librte_mempool_ring.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_mempool_ring.so 00:01:51.744 Installing symlink pointing to librte_net_i40e.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_net_i40e.so.25 00:01:51.744 Installing symlink pointing to librte_net_i40e.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_net_i40e.so 00:01:51.744 Installing symlink pointing to librte_power_acpi.so.25.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_acpi.so.25 00:01:51.744 Installing symlink pointing to librte_power_acpi.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_acpi.so 00:01:51.744 './librte_bus_pci.so' -> 'dpdk/pmds-25.0/librte_bus_pci.so' 00:01:51.744 './librte_bus_pci.so.25' -> 'dpdk/pmds-25.0/librte_bus_pci.so.25' 00:01:51.744 './librte_bus_pci.so.25.0' -> 'dpdk/pmds-25.0/librte_bus_pci.so.25.0' 00:01:51.744 './librte_bus_vdev.so' -> 'dpdk/pmds-25.0/librte_bus_vdev.so' 00:01:51.744 './librte_bus_vdev.so.25' -> 'dpdk/pmds-25.0/librte_bus_vdev.so.25' 00:01:51.744 './librte_bus_vdev.so.25.0' -> 'dpdk/pmds-25.0/librte_bus_vdev.so.25.0' 00:01:51.744 './librte_mempool_ring.so' -> 'dpdk/pmds-25.0/librte_mempool_ring.so' 00:01:51.744 './librte_mempool_ring.so.25' -> 'dpdk/pmds-25.0/librte_mempool_ring.so.25' 00:01:51.744 './librte_mempool_ring.so.25.0' -> 'dpdk/pmds-25.0/librte_mempool_ring.so.25.0' 00:01:51.744 './librte_net_i40e.so' -> 'dpdk/pmds-25.0/librte_net_i40e.so' 00:01:51.744 './librte_net_i40e.so.25' -> 'dpdk/pmds-25.0/librte_net_i40e.so.25' 00:01:51.744 './librte_net_i40e.so.25.0' -> 'dpdk/pmds-25.0/librte_net_i40e.so.25.0' 00:01:51.744 './librte_power_acpi.so' -> 'dpdk/pmds-25.0/librte_power_acpi.so' 00:01:51.744 './librte_power_acpi.so.25' -> 'dpdk/pmds-25.0/librte_power_acpi.so.25' 00:01:51.744 './librte_power_acpi.so.25.0' -> 'dpdk/pmds-25.0/librte_power_acpi.so.25.0' 00:01:51.744 './librte_power_amd_pstate.so' -> 'dpdk/pmds-25.0/librte_power_amd_pstate.so' 00:01:51.744 './librte_power_amd_pstate.so.25' -> 'dpdk/pmds-25.0/librte_power_amd_pstate.so.25' 00:01:51.744 './librte_power_amd_pstate.so.25.0' -> 'dpdk/pmds-25.0/librte_power_amd_pstate.so.25.0' 00:01:51.744 './librte_power_cppc.so' -> 'dpdk/pmds-25.0/librte_power_cppc.so' 00:01:51.744 './librte_power_cppc.so.25' -> 'dpdk/pmds-25.0/librte_power_cppc.so.25' 00:01:51.744 './librte_power_cppc.so.25.0' -> 'dpdk/pmds-25.0/librte_power_cppc.so.25.0' 00:01:51.744 './librte_power_intel_pstate.so' -> 'dpdk/pmds-25.0/librte_power_intel_pstate.so' 00:01:51.744 './librte_power_intel_pstate.so.25' -> 'dpdk/pmds-25.0/librte_power_intel_pstate.so.25' 00:01:51.744 './librte_power_intel_pstate.so.25.0' -> 'dpdk/pmds-25.0/librte_power_intel_pstate.so.25.0' 00:01:51.744 './librte_power_intel_uncore.so' -> 'dpdk/pmds-25.0/librte_power_intel_uncore.so' 00:01:51.744 './librte_power_intel_uncore.so.25' -> 'dpdk/pmds-25.0/librte_power_intel_uncore.so.25' 00:01:51.744 './librte_power_intel_uncore.so.25.0' -> 'dpdk/pmds-25.0/librte_power_intel_uncore.so.25.0' 00:01:51.744 './librte_power_kvm_vm.so' -> 'dpdk/pmds-25.0/librte_power_kvm_vm.so' 00:01:51.744 './librte_power_kvm_vm.so.25' -> 'dpdk/pmds-25.0/librte_power_kvm_vm.so.25' 00:01:51.744 './librte_power_kvm_vm.so.25.0' -> 'dpdk/pmds-25.0/librte_power_kvm_vm.so.25.0' 00:01:51.744 Installing symlink pointing to librte_power_amd_pstate.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_amd_pstate.so.25 00:01:51.744 Installing symlink pointing to librte_power_amd_pstate.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_amd_pstate.so 00:01:51.744 Installing symlink pointing to librte_power_cppc.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_cppc.so.25 00:01:51.744 Installing symlink pointing to librte_power_cppc.so.25 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_cppc.so 00:01:51.744 Installing symlink pointing to librte_power_intel_pstate.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_intel_pstate.so.25 00:01:51.744 Installing symlink pointing to librte_power_intel_pstate.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_intel_pstate.so 00:01:51.744 Installing symlink pointing to librte_power_intel_uncore.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_intel_uncore.so.25 00:01:51.744 Installing symlink pointing to librte_power_intel_uncore.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_intel_uncore.so 00:01:51.744 Installing symlink pointing to librte_power_kvm_vm.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_kvm_vm.so.25 00:01:51.744 Installing symlink pointing to librte_power_kvm_vm.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_kvm_vm.so 00:01:51.744 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-25.0' 00:01:51.744 15:38:59 build_native_dpdk -- common/autobuild_common.sh@220 -- $ cat 00:01:51.744 15:38:59 build_native_dpdk -- common/autobuild_common.sh@225 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:51.744 00:01:51.744 real 0m28.160s 00:01:51.744 user 8m29.780s 00:01:51.744 sys 2m45.664s 00:01:51.744 15:38:59 build_native_dpdk -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:01:51.744 15:38:59 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:01:51.744 ************************************ 00:01:51.744 END TEST build_native_dpdk 00:01:51.744 ************************************ 00:01:51.744 15:38:59 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:51.744 15:38:59 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:51.744 15:38:59 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:01:51.744 15:38:59 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:01:51.744 15:38:59 -- common/autobuild_common.sh@445 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:01:51.744 15:38:59 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']' 00:01:51.744 15:38:59 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:01:51.744 15:38:59 -- common/autotest_common.sh@10 -- $ set +x 00:01:51.744 ************************************ 00:01:51.744 START TEST autobuild_llvm_precompile 00:01:51.744 ************************************ 00:01:51.744 15:38:59 autobuild_llvm_precompile -- common/autotest_common.sh@1129 -- $ _llvm_precompile 00:01:51.744 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version 00:01:52.003 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:01:52.003 Target: x86_64-redhat-linux-gnu 00:01:52.003 Thread model: posix 00:01:52.003 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:01:52.003 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17 00:01:52.003 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:01:52.003 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ 
CC=clang-17 00:01:52.003 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:01:52.003 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:01:52.003 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:01:52.003 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:52.003 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:01:52.003 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:01:52.003 15:38:59 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:52.003 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:01:52.263 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:52.263 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:52.523 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:52.782 Using 'verbs' RDMA provider 00:02:08.633 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:02:20.848 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:02:21.416 Creating mk/config.mk...done. 00:02:21.416 Creating mk/cc.flags.mk...done. 00:02:21.416 Type 'make' to build. 
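
The long run of "Installing symlink pointing to ..." messages and the './librte_*.so' -> 'dpdk/pmds-25.0/...' lines above come from DPDK's driver install step: the PMD shared objects themselves live in the versioned lib/dpdk/pmds-25.0/ directory, and symlink-drivers-solibs.sh leaves compatibility links in lib/ so plain -lrte_bus_pci style link lines keep resolving. Below is a rough sketch of the resulting layout for one driver, not the actual script; the directory is a stand-in for the build libdir used in this run.

    # Illustration only: the layout implied by the install log above.
    cd dpdk/build/lib                                    # stand-in path
    # the real object plus its SONAME chain sit in the PMD directory
    ln -sv librte_bus_pci.so.25.0 dpdk/pmds-25.0/librte_bus_pci.so.25
    ln -sv librte_bus_pci.so.25   dpdk/pmds-25.0/librte_bus_pci.so
    # compatibility links in lib/ point into the PMD directory; ln -sv
    # prints the same 'link' -> 'target' lines quoted in the log
    ln -sv dpdk/pmds-25.0/librte_bus_pci.so      ./librte_bus_pci.so
    ln -sv dpdk/pmds-25.0/librte_bus_pci.so.25   ./librte_bus_pci.so.25
    ln -sv dpdk/pmds-25.0/librte_bus_pci.so.25.0 ./librte_bus_pci.so.25.0
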
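The xtrace above also shows how the precompile step locates the libFuzzer runtime that configure receives via --with-fuzzer: the clang major version is captured from `clang --version`, then a bash extended glob probes the clang resource directories. A minimal standalone sketch of that lookup follows, assuming clang 17 on Fedora as in this run; the shopt line is an assumption (the harness must enable extglob somewhere for the @( ) and ?( ) patterns to parse).

    # Sketch of the fuzzer-library lookup; extglob is required for the
    # @(...) and ?(...) patterns, nullglob keeps a failed match empty.
    shopt -s extglob nullglob
    clang_num=17
    clang_version=17.0.6
    fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
    fuzzer_lib=${fuzzer_libs[0]}
    [[ -e "$fuzzer_lib" ]] || { echo "no libFuzzer runtime found" >&2; exit 1; }
    CC=clang-17 CXX=clang++-17 ./configure --with-fuzzer="$fuzzer_lib"  # plus the other flags shown above
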
00:02:21.416
00:02:21.416 real 0m29.465s
00:02:21.416 user 0m13.006s
00:02:21.416 sys 0m15.784s
00:02:21.416 15:39:29 autobuild_llvm_precompile -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:02:21.416 15:39:29 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x
00:02:21.416 ************************************
00:02:21.416 END TEST autobuild_llvm_precompile
00:02:21.416 ************************************
00:02:21.416 15:39:29 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:02:21.416 15:39:29 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:02:21.416 15:39:29 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:02:21.416 15:39:29 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:02:21.416 15:39:29 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:21.675 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:21.675 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:21.675 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:21.934 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:22.193 Using 'verbs' RDMA provider
00:02:35.359 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:02:47.568 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:02:47.568 Creating mk/config.mk...done.
00:02:47.568 Creating mk/cc.flags.mk...done.
00:02:47.568 Type 'make' to build.
00:02:47.568 15:39:55 -- spdk/autobuild.sh@70 -- $ run_test make make -j112
00:02:47.568 15:39:55 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']'
00:02:47.568 15:39:55 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:02:47.568 15:39:55 -- common/autotest_common.sh@10 -- $ set +x
00:02:47.568 ************************************
00:02:47.568 START TEST make
00:02:47.568 ************************************
00:02:47.568 15:39:55 make -- common/autotest_common.sh@1129 -- $ make -j112
00:02:47.568 make[1]: Nothing to be done for 'all'.
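
run_test, which brackets both tests above, is the harness's timing-and-banner wrapper: the START TEST/END TEST asterisk banners and the real/user/sys lines in this log are its output. A simplified sketch of the pattern follows; the real helper in autotest_common.sh additionally manages xtrace state and validates its arguments.

    # Simplified run_test sketch, not the actual autotest_common.sh code.
    run_test() {
        local test_name=$1; shift
        echo "************************************"
        echo "START TEST $test_name"
        echo "************************************"
        time "$@"            # bash's time keyword prints real/user/sys
        local rc=$?
        echo "************************************"
        echo "END TEST $test_name"
        echo "************************************"
        return $rc
    }

    run_test make make -j112   # the invocation traced above
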
00:02:49.471 The Meson build system
00:02:49.471 Version: 1.5.0
00:02:49.471 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:02:49.471 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:49.471 Build type: native build
00:02:49.471 Project name: libvfio-user
00:02:49.471 Project version: 0.0.1
00:02:49.471 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)")
00:02:49.471 C linker for the host machine: clang-17 ld.bfd 2.40-14
00:02:49.471 Host machine cpu family: x86_64
00:02:49.471 Host machine cpu: x86_64
00:02:49.471 Run-time dependency threads found: YES
00:02:49.471 Library dl found: YES
00:02:49.471 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:02:49.471 Run-time dependency json-c found: YES 0.17
00:02:49.471 Run-time dependency cmocka found: YES 1.1.7
00:02:49.471 Program pytest-3 found: NO
00:02:49.471 Program flake8 found: NO
00:02:49.471 Program misspell-fixer found: NO
00:02:49.471 Program restructuredtext-lint found: NO
00:02:49.471 Program valgrind found: YES (/usr/bin/valgrind)
00:02:49.471 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:49.471 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:49.471 Compiler for C supports arguments -Wwrite-strings: YES
00:02:49.471 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:49.471 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:02:49.471 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:02:49.471 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
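
This Meson pass configures the bundled libvfio-user with the options echoed in the "User defined options" summary just below (debug build type, static library, custom libdir). The log never prints the invocation itself, so what follows is a hedged reconstruction of an equivalent command line, with paths taken from the Source dir/Build dir lines above.

    # Hedged reconstruction; option values come from the summary below.
    meson setup \
        --buildtype debug \
        --default-library static \
        --libdir /usr/local/lib \
        /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug \
        /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
    ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
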
00:02:49.471 Build targets in project: 8 00:02:49.471 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:02:49.471 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:02:49.471 00:02:49.471 libvfio-user 0.0.1 00:02:49.471 00:02:49.471 User defined options 00:02:49.471 buildtype : debug 00:02:49.471 default_library: static 00:02:49.471 libdir : /usr/local/lib 00:02:49.471 00:02:49.471 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:49.471 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:49.729 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:02:49.729 [2/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:02:49.729 [3/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:02:49.729 [4/36] Compiling C object samples/null.p/null.c.o 00:02:49.729 [5/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:02:49.729 [6/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:02:49.729 [7/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:02:49.729 [8/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:02:49.729 [9/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:02:49.729 [10/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:02:49.729 [11/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:02:49.729 [12/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:02:49.729 [13/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:02:49.729 [14/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:02:49.729 [15/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:02:49.729 [16/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:02:49.729 [17/36] Compiling C object test/unit_tests.p/mocks.c.o 00:02:49.729 [18/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:02:49.729 [19/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:02:49.729 [20/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:02:49.729 [21/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:02:49.729 [22/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:02:49.729 [23/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:02:49.729 [24/36] Compiling C object samples/server.p/server.c.o 00:02:49.729 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:02:49.729 [26/36] Compiling C object samples/client.p/client.c.o 00:02:49.729 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:02:49.729 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:02:49.729 [29/36] Linking target samples/client 00:02:49.729 [30/36] Linking static target lib/libvfio-user.a 00:02:49.729 [31/36] Linking target test/unit_tests 00:02:49.729 [32/36] Linking target samples/server 00:02:49.729 [33/36] Linking target samples/gpio-pci-idio-16 00:02:49.729 [34/36] Linking target samples/lspci 00:02:49.729 [35/36] Linking target samples/shadow_ioeventfd_server 00:02:49.729 [36/36] Linking target samples/null 00:02:49.729 INFO: autodetecting backend as ninja 00:02:49.729 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:49.986 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:50.244 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:50.244 ninja: no work to do. 00:03:05.149 CC lib/log/log_flags.o 00:03:05.149 CC lib/log/log_deprecated.o 00:03:05.149 CC lib/log/log.o 00:03:05.149 CC lib/ut_mock/mock.o 00:03:05.149 CC lib/ut/ut.o 00:03:05.149 LIB libspdk_log.a 00:03:05.149 LIB libspdk_ut_mock.a 00:03:05.149 LIB libspdk_ut.a 00:03:05.149 CC lib/util/base64.o 00:03:05.149 CC lib/util/bit_array.o 00:03:05.149 CC lib/util/cpuset.o 00:03:05.149 CC lib/util/crc16.o 00:03:05.149 CC lib/util/crc32.o 00:03:05.149 CC lib/util/crc32_ieee.o 00:03:05.149 CC lib/util/crc32c.o 00:03:05.149 CC lib/util/crc64.o 00:03:05.149 CC lib/util/dif.o 00:03:05.149 CC lib/util/fd.o 00:03:05.149 CC lib/util/file.o 00:03:05.149 CC lib/util/fd_group.o 00:03:05.149 CC lib/util/hexlify.o 00:03:05.149 CC lib/util/net.o 00:03:05.149 CC lib/util/iov.o 00:03:05.149 CC lib/util/math.o 00:03:05.149 CC lib/util/pipe.o 00:03:05.149 CC lib/util/strerror_tls.o 00:03:05.149 CC lib/util/uuid.o 00:03:05.149 CC lib/util/string.o 00:03:05.149 CC lib/util/zipf.o 00:03:05.149 CC lib/util/xor.o 00:03:05.149 CC lib/util/md5.o 00:03:05.149 CXX lib/trace_parser/trace.o 00:03:05.149 CC lib/dma/dma.o 00:03:05.149 CC lib/ioat/ioat.o 00:03:05.149 CC lib/vfio_user/host/vfio_user_pci.o 00:03:05.149 CC lib/vfio_user/host/vfio_user.o 00:03:05.149 LIB libspdk_dma.a 00:03:05.149 LIB libspdk_ioat.a 00:03:05.149 LIB libspdk_vfio_user.a 00:03:05.149 LIB libspdk_util.a 00:03:05.149 LIB libspdk_trace_parser.a 00:03:05.149 CC lib/rdma_utils/rdma_utils.o 00:03:05.149 CC lib/conf/conf.o 00:03:05.149 CC lib/idxd/idxd.o 00:03:05.149 CC lib/json/json_parse.o 00:03:05.149 CC lib/idxd/idxd_user.o 00:03:05.149 CC lib/vmd/vmd.o 00:03:05.149 CC lib/idxd/idxd_kernel.o 00:03:05.149 CC lib/vmd/led.o 00:03:05.149 CC lib/json/json_write.o 00:03:05.149 CC lib/json/json_util.o 00:03:05.149 CC lib/env_dpdk/pci.o 00:03:05.149 CC lib/env_dpdk/env.o 00:03:05.149 CC lib/env_dpdk/memory.o 00:03:05.149 CC lib/env_dpdk/threads.o 00:03:05.149 CC lib/env_dpdk/init.o 00:03:05.149 CC lib/env_dpdk/pci_ioat.o 00:03:05.149 CC lib/env_dpdk/pci_virtio.o 00:03:05.149 CC lib/env_dpdk/pci_vmd.o 00:03:05.149 CC lib/env_dpdk/pci_idxd.o 00:03:05.149 CC lib/env_dpdk/pci_event.o 00:03:05.149 CC lib/env_dpdk/sigbus_handler.o 00:03:05.149 CC lib/env_dpdk/pci_dpdk.o 00:03:05.149 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:05.149 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:05.149 LIB libspdk_conf.a 00:03:05.149 LIB libspdk_rdma_utils.a 00:03:05.149 LIB libspdk_json.a 00:03:05.149 LIB libspdk_idxd.a 00:03:05.149 LIB libspdk_vmd.a 00:03:05.149 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:05.149 CC lib/rdma_provider/common.o 00:03:05.149 CC lib/jsonrpc/jsonrpc_server.o 00:03:05.149 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:05.149 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:05.149 CC lib/jsonrpc/jsonrpc_client.o 00:03:05.149 LIB libspdk_rdma_provider.a 00:03:05.149 LIB libspdk_jsonrpc.a 00:03:05.149 LIB libspdk_env_dpdk.a 00:03:05.149 CC lib/rpc/rpc.o 00:03:05.149 LIB libspdk_rpc.a 00:03:05.149 CC lib/notify/notify.o 00:03:05.149 CC lib/notify/notify_rpc.o 00:03:05.149 CC lib/keyring/keyring.o 00:03:05.149 CC lib/keyring/keyring_rpc.o 00:03:05.149 CC lib/trace/trace_rpc.o 00:03:05.149 CC lib/trace/trace.o 00:03:05.149 CC lib/trace/trace_flags.o 00:03:05.408 LIB libspdk_notify.a 00:03:05.408 LIB libspdk_trace.a 00:03:05.408 LIB 
libspdk_keyring.a 00:03:05.667 CC lib/sock/sock.o 00:03:05.667 CC lib/sock/sock_rpc.o 00:03:05.667 CC lib/thread/thread.o 00:03:05.667 CC lib/thread/iobuf.o 00:03:05.926 LIB libspdk_sock.a 00:03:06.185 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:06.185 CC lib/nvme/nvme_ctrlr.o 00:03:06.185 CC lib/nvme/nvme_ns_cmd.o 00:03:06.185 CC lib/nvme/nvme_fabric.o 00:03:06.185 CC lib/nvme/nvme_pcie_common.o 00:03:06.185 CC lib/nvme/nvme_ns.o 00:03:06.185 CC lib/nvme/nvme.o 00:03:06.185 CC lib/nvme/nvme_pcie.o 00:03:06.185 CC lib/nvme/nvme_qpair.o 00:03:06.185 CC lib/nvme/nvme_transport.o 00:03:06.185 CC lib/nvme/nvme_quirks.o 00:03:06.185 CC lib/nvme/nvme_discovery.o 00:03:06.185 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:06.185 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:06.185 CC lib/nvme/nvme_tcp.o 00:03:06.185 CC lib/nvme/nvme_opal.o 00:03:06.185 CC lib/nvme/nvme_io_msg.o 00:03:06.185 CC lib/nvme/nvme_poll_group.o 00:03:06.185 CC lib/nvme/nvme_zns.o 00:03:06.185 CC lib/nvme/nvme_stubs.o 00:03:06.185 CC lib/nvme/nvme_auth.o 00:03:06.185 CC lib/nvme/nvme_cuse.o 00:03:06.185 CC lib/nvme/nvme_vfio_user.o 00:03:06.185 CC lib/nvme/nvme_rdma.o 00:03:06.443 LIB libspdk_thread.a 00:03:06.701 CC lib/virtio/virtio.o 00:03:06.701 CC lib/virtio/virtio_vfio_user.o 00:03:06.701 CC lib/virtio/virtio_vhost_user.o 00:03:06.701 CC lib/virtio/virtio_pci.o 00:03:06.701 CC lib/fsdev/fsdev.o 00:03:06.701 CC lib/fsdev/fsdev_io.o 00:03:06.701 CC lib/init/json_config.o 00:03:06.701 CC lib/fsdev/fsdev_rpc.o 00:03:06.701 CC lib/init/rpc.o 00:03:06.701 CC lib/init/subsystem.o 00:03:06.701 CC lib/init/subsystem_rpc.o 00:03:06.701 CC lib/accel/accel.o 00:03:06.701 CC lib/accel/accel_rpc.o 00:03:06.701 CC lib/accel/accel_sw.o 00:03:06.701 CC lib/vfu_tgt/tgt_endpoint.o 00:03:06.701 CC lib/vfu_tgt/tgt_rpc.o 00:03:06.701 CC lib/blob/blob_bs_dev.o 00:03:06.701 CC lib/blob/blobstore.o 00:03:06.701 CC lib/blob/request.o 00:03:06.701 CC lib/blob/zeroes.o 00:03:06.960 LIB libspdk_init.a 00:03:06.960 LIB libspdk_virtio.a 00:03:06.960 LIB libspdk_vfu_tgt.a 00:03:06.960 LIB libspdk_fsdev.a 00:03:07.219 CC lib/event/app.o 00:03:07.219 CC lib/event/app_rpc.o 00:03:07.219 CC lib/event/reactor.o 00:03:07.219 CC lib/event/log_rpc.o 00:03:07.219 CC lib/event/scheduler_static.o 00:03:07.477 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:03:07.477 LIB libspdk_accel.a 00:03:07.477 LIB libspdk_event.a 00:03:07.477 LIB libspdk_nvme.a 00:03:07.735 LIB libspdk_fuse_dispatcher.a 00:03:07.735 CC lib/bdev/bdev.o 00:03:07.735 CC lib/bdev/part.o 00:03:07.735 CC lib/bdev/bdev_rpc.o 00:03:07.735 CC lib/bdev/bdev_zone.o 00:03:07.735 CC lib/bdev/scsi_nvme.o 00:03:08.672 LIB libspdk_blob.a 00:03:08.672 CC lib/blobfs/blobfs.o 00:03:08.672 CC lib/blobfs/tree.o 00:03:08.672 CC lib/lvol/lvol.o 00:03:09.240 LIB libspdk_lvol.a 00:03:09.240 LIB libspdk_blobfs.a 00:03:09.499 LIB libspdk_bdev.a 00:03:09.757 CC lib/ftl/ftl_init.o 00:03:09.757 CC lib/ftl/ftl_core.o 00:03:09.757 CC lib/ftl/ftl_io.o 00:03:09.757 CC lib/ftl/ftl_layout.o 00:03:09.757 CC lib/ftl/ftl_sb.o 00:03:09.757 CC lib/ftl/ftl_debug.o 00:03:09.757 CC lib/ftl/ftl_nv_cache.o 00:03:09.757 CC lib/ftl/ftl_l2p.o 00:03:09.757 CC lib/ftl/ftl_l2p_flat.o 00:03:09.757 CC lib/ftl/ftl_band_ops.o 00:03:09.757 CC lib/ftl/ftl_band.o 00:03:09.757 CC lib/ftl/ftl_writer.o 00:03:09.757 CC lib/ftl/ftl_rq.o 00:03:09.757 CC lib/ftl/ftl_reloc.o 00:03:09.757 CC lib/ftl/ftl_l2p_cache.o 00:03:09.757 CC lib/ftl/ftl_p2l.o 00:03:09.757 CC lib/ftl/ftl_p2l_log.o 00:03:09.757 CC lib/scsi/port.o 00:03:09.757 CC lib/scsi/dev.o 00:03:09.757 CC 
lib/ftl/mngt/ftl_mngt.o 00:03:09.757 CC lib/scsi/lun.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:09.757 CC lib/scsi/scsi.o 00:03:09.757 CC lib/nvmf/ctrlr_discovery.o 00:03:09.757 CC lib/nvmf/ctrlr.o 00:03:09.757 CC lib/scsi/scsi_bdev.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:09.757 CC lib/scsi/scsi_pr.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:09.757 CC lib/nvmf/ctrlr_bdev.o 00:03:09.757 CC lib/scsi/scsi_rpc.o 00:03:09.757 CC lib/nvmf/subsystem.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:09.757 CC lib/ublk/ublk.o 00:03:09.757 CC lib/scsi/task.o 00:03:09.757 CC lib/nvmf/transport.o 00:03:09.757 CC lib/ublk/ublk_rpc.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:09.757 CC lib/nvmf/nvmf.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:09.757 CC lib/nvmf/nvmf_rpc.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:09.757 CC lib/nvmf/tcp.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:09.757 CC lib/nvmf/stubs.o 00:03:09.757 CC lib/nvmf/mdns_server.o 00:03:09.757 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:09.757 CC lib/nvmf/rdma.o 00:03:09.757 CC lib/nvmf/vfio_user.o 00:03:09.757 CC lib/ftl/utils/ftl_conf.o 00:03:09.757 CC lib/ftl/utils/ftl_md.o 00:03:09.757 CC lib/nvmf/auth.o 00:03:09.757 CC lib/nbd/nbd.o 00:03:09.757 CC lib/ftl/utils/ftl_bitmap.o 00:03:09.757 CC lib/nbd/nbd_rpc.o 00:03:09.757 CC lib/ftl/utils/ftl_mempool.o 00:03:09.757 CC lib/ftl/utils/ftl_property.o 00:03:09.757 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:09.757 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:09.757 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:09.757 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:09.757 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:09.757 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:09.757 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:09.757 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:09.757 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:09.757 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:09.757 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:09.757 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:03:09.757 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:03:09.757 CC lib/ftl/base/ftl_base_dev.o 00:03:09.757 CC lib/ftl/base/ftl_base_bdev.o 00:03:09.757 CC lib/ftl/ftl_trace.o 00:03:10.324 LIB libspdk_nbd.a 00:03:10.324 LIB libspdk_ublk.a 00:03:10.324 LIB libspdk_scsi.a 00:03:10.324 LIB libspdk_ftl.a 00:03:10.583 CC lib/vhost/vhost.o 00:03:10.583 CC lib/vhost/vhost_scsi.o 00:03:10.583 CC lib/vhost/vhost_rpc.o 00:03:10.583 CC lib/vhost/vhost_blk.o 00:03:10.583 CC lib/vhost/rte_vhost_user.o 00:03:10.583 CC lib/iscsi/iscsi.o 00:03:10.583 CC lib/iscsi/conn.o 00:03:10.583 CC lib/iscsi/init_grp.o 00:03:10.583 CC lib/iscsi/tgt_node.o 00:03:10.583 CC lib/iscsi/param.o 00:03:10.583 CC lib/iscsi/portal_grp.o 00:03:10.583 CC lib/iscsi/iscsi_subsystem.o 00:03:10.583 CC lib/iscsi/iscsi_rpc.o 00:03:10.583 CC lib/iscsi/task.o 00:03:10.841 LIB libspdk_nvmf.a 00:03:11.099 LIB libspdk_vhost.a 00:03:11.357 LIB libspdk_iscsi.a 00:03:11.988 CC module/env_dpdk/env_dpdk_rpc.o 00:03:11.988 CC module/vfu_device/vfu_virtio_blk.o 00:03:11.988 CC module/vfu_device/vfu_virtio.o 00:03:11.988 CC module/vfu_device/vfu_virtio_scsi.o 00:03:11.988 CC module/vfu_device/vfu_virtio_rpc.o 00:03:11.988 CC module/vfu_device/vfu_virtio_fs.o 00:03:11.988 LIB libspdk_env_dpdk_rpc.a 00:03:11.988 CC module/fsdev/aio/fsdev_aio.o 00:03:11.988 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:11.988 
CC module/fsdev/aio/fsdev_aio_rpc.o 00:03:11.988 CC module/blob/bdev/blob_bdev.o 00:03:11.988 CC module/fsdev/aio/linux_aio_mgr.o 00:03:11.988 CC module/keyring/file/keyring.o 00:03:11.988 CC module/scheduler/gscheduler/gscheduler.o 00:03:11.988 CC module/keyring/file/keyring_rpc.o 00:03:11.988 CC module/keyring/linux/keyring.o 00:03:11.988 CC module/accel/error/accel_error.o 00:03:11.988 CC module/keyring/linux/keyring_rpc.o 00:03:11.988 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:11.988 CC module/sock/posix/posix.o 00:03:11.988 CC module/accel/error/accel_error_rpc.o 00:03:11.988 CC module/accel/iaa/accel_iaa.o 00:03:11.988 CC module/accel/iaa/accel_iaa_rpc.o 00:03:11.988 CC module/accel/ioat/accel_ioat.o 00:03:11.988 CC module/accel/ioat/accel_ioat_rpc.o 00:03:11.988 CC module/accel/dsa/accel_dsa.o 00:03:11.988 CC module/accel/dsa/accel_dsa_rpc.o 00:03:11.988 LIB libspdk_scheduler_dpdk_governor.a 00:03:11.988 LIB libspdk_keyring_linux.a 00:03:11.988 LIB libspdk_keyring_file.a 00:03:11.988 LIB libspdk_scheduler_gscheduler.a 00:03:11.988 LIB libspdk_accel_error.a 00:03:11.988 LIB libspdk_scheduler_dynamic.a 00:03:12.259 LIB libspdk_accel_iaa.a 00:03:12.259 LIB libspdk_accel_ioat.a 00:03:12.259 LIB libspdk_blob_bdev.a 00:03:12.259 LIB libspdk_accel_dsa.a 00:03:12.259 LIB libspdk_vfu_device.a 00:03:12.259 LIB libspdk_sock_posix.a 00:03:12.518 LIB libspdk_fsdev_aio.a 00:03:12.518 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:12.518 CC module/bdev/gpt/gpt.o 00:03:12.518 CC module/blobfs/bdev/blobfs_bdev.o 00:03:12.518 CC module/bdev/gpt/vbdev_gpt.o 00:03:12.518 CC module/bdev/error/vbdev_error.o 00:03:12.518 CC module/bdev/delay/vbdev_delay.o 00:03:12.518 CC module/bdev/error/vbdev_error_rpc.o 00:03:12.518 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:12.518 CC module/bdev/lvol/vbdev_lvol.o 00:03:12.518 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:12.518 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:12.518 CC module/bdev/ftl/bdev_ftl.o 00:03:12.518 CC module/bdev/aio/bdev_aio.o 00:03:12.518 CC module/bdev/aio/bdev_aio_rpc.o 00:03:12.518 CC module/bdev/null/bdev_null.o 00:03:12.518 CC module/bdev/raid/bdev_raid.o 00:03:12.518 CC module/bdev/null/bdev_null_rpc.o 00:03:12.518 CC module/bdev/iscsi/bdev_iscsi.o 00:03:12.518 CC module/bdev/raid/bdev_raid_rpc.o 00:03:12.518 CC module/bdev/passthru/vbdev_passthru.o 00:03:12.518 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:12.518 CC module/bdev/raid/bdev_raid_sb.o 00:03:12.518 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:12.518 CC module/bdev/raid/raid0.o 00:03:12.518 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:12.518 CC module/bdev/raid/raid1.o 00:03:12.518 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:12.518 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:12.518 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:12.518 CC module/bdev/malloc/bdev_malloc.o 00:03:12.518 CC module/bdev/raid/concat.o 00:03:12.518 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:12.518 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:12.518 CC module/bdev/split/vbdev_split.o 00:03:12.518 CC module/bdev/split/vbdev_split_rpc.o 00:03:12.518 CC module/bdev/nvme/bdev_nvme.o 00:03:12.518 CC module/bdev/nvme/nvme_rpc.o 00:03:12.518 CC module/bdev/nvme/bdev_mdns_client.o 00:03:12.518 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:12.518 CC module/bdev/nvme/vbdev_opal.o 00:03:12.518 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:12.518 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:12.777 LIB libspdk_blobfs_bdev.a 00:03:12.777 LIB libspdk_bdev_error.a 00:03:12.777 LIB 
libspdk_bdev_gpt.a 00:03:12.777 LIB libspdk_bdev_split.a 00:03:12.777 LIB libspdk_bdev_ftl.a 00:03:12.777 LIB libspdk_bdev_null.a 00:03:12.777 LIB libspdk_bdev_passthru.a 00:03:12.777 LIB libspdk_bdev_aio.a 00:03:12.777 LIB libspdk_bdev_delay.a 00:03:12.777 LIB libspdk_bdev_iscsi.a 00:03:12.777 LIB libspdk_bdev_zone_block.a 00:03:12.777 LIB libspdk_bdev_malloc.a 00:03:13.036 LIB libspdk_bdev_lvol.a 00:03:13.036 LIB libspdk_bdev_virtio.a 00:03:13.294 LIB libspdk_bdev_raid.a 00:03:14.228 LIB libspdk_bdev_nvme.a 00:03:14.486 CC module/event/subsystems/vmd/vmd.o 00:03:14.486 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:14.486 CC module/event/subsystems/scheduler/scheduler.o 00:03:14.486 CC module/event/subsystems/fsdev/fsdev.o 00:03:14.486 CC module/event/subsystems/sock/sock.o 00:03:14.486 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:14.486 CC module/event/subsystems/iobuf/iobuf.o 00:03:14.486 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:14.486 CC module/event/subsystems/keyring/keyring.o 00:03:14.486 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:14.744 LIB libspdk_event_vmd.a 00:03:14.744 LIB libspdk_event_fsdev.a 00:03:14.744 LIB libspdk_event_scheduler.a 00:03:14.744 LIB libspdk_event_vhost_blk.a 00:03:14.744 LIB libspdk_event_keyring.a 00:03:14.744 LIB libspdk_event_sock.a 00:03:14.744 LIB libspdk_event_vfu_tgt.a 00:03:14.744 LIB libspdk_event_iobuf.a 00:03:15.003 CC module/event/subsystems/accel/accel.o 00:03:15.261 LIB libspdk_event_accel.a 00:03:15.521 CC module/event/subsystems/bdev/bdev.o 00:03:15.521 LIB libspdk_event_bdev.a 00:03:15.780 CC module/event/subsystems/ublk/ublk.o 00:03:15.780 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:15.780 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:15.780 CC module/event/subsystems/scsi/scsi.o 00:03:15.780 CC module/event/subsystems/nbd/nbd.o 00:03:16.039 LIB libspdk_event_ublk.a 00:03:16.039 LIB libspdk_event_nbd.a 00:03:16.039 LIB libspdk_event_scsi.a 00:03:16.039 LIB libspdk_event_nvmf.a 00:03:16.298 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:16.298 CC module/event/subsystems/iscsi/iscsi.o 00:03:16.298 LIB libspdk_event_vhost_scsi.a 00:03:16.558 LIB libspdk_event_iscsi.a 00:03:16.820 CC app/trace_record/trace_record.o 00:03:16.820 CC app/spdk_nvme_identify/identify.o 00:03:16.820 CC app/spdk_nvme_perf/perf.o 00:03:16.820 CC test/rpc_client/rpc_client_test.o 00:03:16.820 CC app/spdk_top/spdk_top.o 00:03:16.820 TEST_HEADER include/spdk/accel.h 00:03:16.820 TEST_HEADER include/spdk/accel_module.h 00:03:16.820 CC app/spdk_lspci/spdk_lspci.o 00:03:16.820 TEST_HEADER include/spdk/bdev.h 00:03:16.820 TEST_HEADER include/spdk/base64.h 00:03:16.820 TEST_HEADER include/spdk/assert.h 00:03:16.820 TEST_HEADER include/spdk/barrier.h 00:03:16.820 TEST_HEADER include/spdk/bdev_module.h 00:03:16.820 TEST_HEADER include/spdk/bdev_zone.h 00:03:16.820 CC app/spdk_nvme_discover/discovery_aer.o 00:03:16.820 TEST_HEADER include/spdk/bit_array.h 00:03:16.820 TEST_HEADER include/spdk/bit_pool.h 00:03:16.820 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:16.820 TEST_HEADER include/spdk/blob_bdev.h 00:03:16.820 CXX app/trace/trace.o 00:03:16.820 TEST_HEADER include/spdk/blobfs.h 00:03:16.820 TEST_HEADER include/spdk/blob.h 00:03:16.820 TEST_HEADER include/spdk/conf.h 00:03:16.820 TEST_HEADER include/spdk/cpuset.h 00:03:16.820 TEST_HEADER include/spdk/config.h 00:03:16.820 TEST_HEADER include/spdk/crc16.h 00:03:16.820 TEST_HEADER include/spdk/crc32.h 00:03:16.820 TEST_HEADER include/spdk/crc64.h 00:03:16.820 TEST_HEADER 
include/spdk/dif.h 00:03:16.820 TEST_HEADER include/spdk/dma.h 00:03:16.820 TEST_HEADER include/spdk/env_dpdk.h 00:03:16.820 TEST_HEADER include/spdk/endian.h 00:03:16.820 TEST_HEADER include/spdk/env.h 00:03:16.820 TEST_HEADER include/spdk/fd.h 00:03:16.820 TEST_HEADER include/spdk/event.h 00:03:16.820 TEST_HEADER include/spdk/fd_group.h 00:03:16.820 TEST_HEADER include/spdk/file.h 00:03:16.820 TEST_HEADER include/spdk/fsdev_module.h 00:03:16.820 CC app/spdk_dd/spdk_dd.o 00:03:16.820 TEST_HEADER include/spdk/ftl.h 00:03:16.820 TEST_HEADER include/spdk/fsdev.h 00:03:16.820 TEST_HEADER include/spdk/gpt_spec.h 00:03:16.820 TEST_HEADER include/spdk/fuse_dispatcher.h 00:03:16.820 TEST_HEADER include/spdk/hexlify.h 00:03:16.820 TEST_HEADER include/spdk/idxd_spec.h 00:03:16.820 TEST_HEADER include/spdk/histogram_data.h 00:03:16.820 TEST_HEADER include/spdk/init.h 00:03:16.820 TEST_HEADER include/spdk/ioat.h 00:03:16.820 TEST_HEADER include/spdk/idxd.h 00:03:16.820 TEST_HEADER include/spdk/ioat_spec.h 00:03:16.820 TEST_HEADER include/spdk/jsonrpc.h 00:03:16.820 TEST_HEADER include/spdk/iscsi_spec.h 00:03:16.820 TEST_HEADER include/spdk/keyring.h 00:03:16.820 TEST_HEADER include/spdk/json.h 00:03:16.820 TEST_HEADER include/spdk/keyring_module.h 00:03:16.820 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:16.820 TEST_HEADER include/spdk/log.h 00:03:16.820 TEST_HEADER include/spdk/lvol.h 00:03:16.820 TEST_HEADER include/spdk/likely.h 00:03:16.820 TEST_HEADER include/spdk/mmio.h 00:03:16.820 TEST_HEADER include/spdk/net.h 00:03:16.820 TEST_HEADER include/spdk/notify.h 00:03:16.820 TEST_HEADER include/spdk/md5.h 00:03:16.820 TEST_HEADER include/spdk/nvme.h 00:03:16.820 TEST_HEADER include/spdk/nbd.h 00:03:16.820 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:16.820 TEST_HEADER include/spdk/memory.h 00:03:16.820 TEST_HEADER include/spdk/nvme_zns.h 00:03:16.820 TEST_HEADER include/spdk/nvme_intel.h 00:03:16.820 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:16.820 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:16.820 TEST_HEADER include/spdk/nvme_spec.h 00:03:16.820 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:16.820 TEST_HEADER include/spdk/nvmf_spec.h 00:03:16.820 TEST_HEADER include/spdk/nvmf_transport.h 00:03:16.820 CC app/nvmf_tgt/nvmf_main.o 00:03:16.820 TEST_HEADER include/spdk/opal_spec.h 00:03:16.820 TEST_HEADER include/spdk/nvmf.h 00:03:16.820 TEST_HEADER include/spdk/opal.h 00:03:16.820 TEST_HEADER include/spdk/pci_ids.h 00:03:16.820 TEST_HEADER include/spdk/pipe.h 00:03:16.820 TEST_HEADER include/spdk/reduce.h 00:03:16.820 TEST_HEADER include/spdk/scheduler.h 00:03:16.820 TEST_HEADER include/spdk/queue.h 00:03:16.820 CC app/iscsi_tgt/iscsi_tgt.o 00:03:16.820 TEST_HEADER include/spdk/scsi_spec.h 00:03:16.820 TEST_HEADER include/spdk/sock.h 00:03:16.820 TEST_HEADER include/spdk/rpc.h 00:03:16.820 TEST_HEADER include/spdk/string.h 00:03:16.820 TEST_HEADER include/spdk/scsi.h 00:03:16.820 TEST_HEADER include/spdk/thread.h 00:03:16.820 TEST_HEADER include/spdk/stdinc.h 00:03:16.820 TEST_HEADER include/spdk/trace.h 00:03:16.820 TEST_HEADER include/spdk/ublk.h 00:03:16.820 TEST_HEADER include/spdk/util.h 00:03:16.820 TEST_HEADER include/spdk/trace_parser.h 00:03:16.820 TEST_HEADER include/spdk/tree.h 00:03:16.820 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:16.820 TEST_HEADER include/spdk/version.h 00:03:16.820 TEST_HEADER include/spdk/uuid.h 00:03:16.820 TEST_HEADER include/spdk/vhost.h 00:03:16.820 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:16.820 TEST_HEADER include/spdk/vmd.h 00:03:16.820 
TEST_HEADER include/spdk/zipf.h 00:03:16.820 TEST_HEADER include/spdk/xor.h 00:03:16.820 CXX test/cpp_headers/accel_module.o 00:03:16.820 CXX test/cpp_headers/accel.o 00:03:16.820 CXX test/cpp_headers/assert.o 00:03:16.820 CXX test/cpp_headers/barrier.o 00:03:16.820 CXX test/cpp_headers/bdev.o 00:03:16.820 CXX test/cpp_headers/base64.o 00:03:16.820 CXX test/cpp_headers/bdev_zone.o 00:03:16.820 CXX test/cpp_headers/bit_array.o 00:03:16.820 CXX test/cpp_headers/bdev_module.o 00:03:16.820 CXX test/cpp_headers/blob_bdev.o 00:03:16.820 CXX test/cpp_headers/blobfs_bdev.o 00:03:16.820 CXX test/cpp_headers/bit_pool.o 00:03:16.820 CXX test/cpp_headers/blobfs.o 00:03:16.820 CXX test/cpp_headers/conf.o 00:03:16.820 CXX test/cpp_headers/blob.o 00:03:16.820 CXX test/cpp_headers/config.o 00:03:16.820 CXX test/cpp_headers/cpuset.o 00:03:16.820 CC app/spdk_tgt/spdk_tgt.o 00:03:16.820 CXX test/cpp_headers/crc16.o 00:03:16.820 CXX test/cpp_headers/dif.o 00:03:16.820 CC examples/util/zipf/zipf.o 00:03:16.820 CXX test/cpp_headers/crc32.o 00:03:16.820 CXX test/cpp_headers/crc64.o 00:03:16.820 CXX test/cpp_headers/endian.o 00:03:16.820 CXX test/cpp_headers/env.o 00:03:16.820 CXX test/cpp_headers/env_dpdk.o 00:03:16.820 CXX test/cpp_headers/dma.o 00:03:16.820 CXX test/cpp_headers/event.o 00:03:16.820 CXX test/cpp_headers/fd_group.o 00:03:16.820 CXX test/cpp_headers/fd.o 00:03:16.820 CXX test/cpp_headers/file.o 00:03:16.820 CXX test/cpp_headers/fsdev.o 00:03:16.820 CXX test/cpp_headers/ftl.o 00:03:16.820 CXX test/cpp_headers/fuse_dispatcher.o 00:03:16.820 CXX test/cpp_headers/fsdev_module.o 00:03:16.820 CXX test/cpp_headers/hexlify.o 00:03:16.820 CXX test/cpp_headers/gpt_spec.o 00:03:16.820 CXX test/cpp_headers/histogram_data.o 00:03:16.820 CXX test/cpp_headers/idxd.o 00:03:16.820 CXX test/cpp_headers/idxd_spec.o 00:03:16.820 CXX test/cpp_headers/ioat.o 00:03:16.820 CXX test/cpp_headers/init.o 00:03:16.820 CXX test/cpp_headers/ioat_spec.o 00:03:16.820 CXX test/cpp_headers/json.o 00:03:16.820 CXX test/cpp_headers/iscsi_spec.o 00:03:16.820 CXX test/cpp_headers/jsonrpc.o 00:03:16.820 CXX test/cpp_headers/keyring.o 00:03:16.820 CXX test/cpp_headers/keyring_module.o 00:03:16.820 CXX test/cpp_headers/likely.o 00:03:16.820 CXX test/cpp_headers/log.o 00:03:16.820 CXX test/cpp_headers/lvol.o 00:03:16.820 CXX test/cpp_headers/md5.o 00:03:16.820 CXX test/cpp_headers/memory.o 00:03:16.820 CXX test/cpp_headers/mmio.o 00:03:16.821 CXX test/cpp_headers/nbd.o 00:03:16.821 CXX test/cpp_headers/net.o 00:03:16.821 CC test/app/jsoncat/jsoncat.o 00:03:16.821 CXX test/cpp_headers/notify.o 00:03:16.821 CXX test/cpp_headers/nvme.o 00:03:16.821 CXX test/cpp_headers/nvme_intel.o 00:03:16.821 CXX test/cpp_headers/nvme_ocssd.o 00:03:16.821 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:16.821 CXX test/cpp_headers/nvme_spec.o 00:03:16.821 CXX test/cpp_headers/nvme_zns.o 00:03:16.821 CXX test/cpp_headers/nvmf_cmd.o 00:03:16.821 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:16.821 CXX test/cpp_headers/nvmf_spec.o 00:03:16.821 CXX test/cpp_headers/nvmf.o 00:03:16.821 CXX test/cpp_headers/opal.o 00:03:16.821 CXX test/cpp_headers/opal_spec.o 00:03:16.821 CXX test/cpp_headers/nvmf_transport.o 00:03:16.821 CXX test/cpp_headers/pci_ids.o 00:03:16.821 CXX test/cpp_headers/pipe.o 00:03:16.821 CXX test/cpp_headers/queue.o 00:03:16.821 CC test/app/stub/stub.o 00:03:16.821 CXX test/cpp_headers/reduce.o 00:03:16.821 CXX test/cpp_headers/rpc.o 00:03:16.821 CC test/thread/poller_perf/poller_perf.o 00:03:16.821 CC examples/ioat/verify/verify.o 00:03:16.821 CXX 
test/cpp_headers/scheduler.o 00:03:16.821 CXX test/cpp_headers/scsi.o 00:03:16.821 LINK spdk_lspci 00:03:16.821 CXX test/cpp_headers/scsi_spec.o 00:03:16.821 CC app/fio/nvme/fio_plugin.o 00:03:16.821 CC examples/ioat/perf/perf.o 00:03:16.821 CC test/app/histogram_perf/histogram_perf.o 00:03:16.821 CXX test/cpp_headers/sock.o 00:03:16.821 CXX test/cpp_headers/stdinc.o 00:03:16.821 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:16.821 CC test/env/memory/memory_ut.o 00:03:16.821 CC test/thread/lock/spdk_lock.o 00:03:16.821 CXX test/cpp_headers/string.o 00:03:16.821 CC test/dma/test_dma/test_dma.o 00:03:16.821 CC test/env/pci/pci_ut.o 00:03:16.821 CC test/env/vtophys/vtophys.o 00:03:16.821 CXX test/cpp_headers/thread.o 00:03:17.081 CC app/fio/bdev/fio_plugin.o 00:03:17.081 LINK rpc_client_test 00:03:17.081 CC test/app/bdev_svc/bdev_svc.o 00:03:17.081 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:17.081 LINK spdk_trace_record 00:03:17.081 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:17.081 LINK spdk_nvme_discover 00:03:17.081 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:17.081 LINK interrupt_tgt 00:03:17.081 CC test/env/mem_callbacks/mem_callbacks.o 00:03:17.081 CXX test/cpp_headers/trace.o 00:03:17.081 LINK zipf 00:03:17.081 LINK nvmf_tgt 00:03:17.081 CXX test/cpp_headers/trace_parser.o 00:03:17.081 CXX test/cpp_headers/tree.o 00:03:17.081 LINK jsoncat 00:03:17.081 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:17.081 CXX test/cpp_headers/ublk.o 00:03:17.081 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:17.081 CXX test/cpp_headers/util.o 00:03:17.081 CXX test/cpp_headers/uuid.o 00:03:17.081 CXX test/cpp_headers/version.o 00:03:17.081 CXX test/cpp_headers/vfio_user_pci.o 00:03:17.081 CXX test/cpp_headers/vfio_user_spec.o 00:03:17.081 CXX test/cpp_headers/vhost.o 00:03:17.081 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:17.081 CXX test/cpp_headers/vmd.o 00:03:17.081 CXX test/cpp_headers/xor.o 00:03:17.081 CXX test/cpp_headers/zipf.o 00:03:17.081 LINK poller_perf 00:03:17.081 LINK histogram_perf 00:03:17.081 LINK stub 00:03:17.081 LINK iscsi_tgt 00:03:17.081 LINK env_dpdk_post_init 00:03:17.081 LINK vtophys 00:03:17.081 LINK spdk_tgt 00:03:17.081 LINK verify 00:03:17.081 LINK ioat_perf 00:03:17.339 LINK bdev_svc 00:03:17.339 LINK spdk_trace 00:03:17.339 LINK spdk_dd 00:03:17.339 LINK llvm_vfio_fuzz 00:03:17.339 LINK nvme_fuzz 00:03:17.339 LINK spdk_nvme_identify 00:03:17.339 LINK test_dma 00:03:17.339 LINK vhost_fuzz 00:03:17.339 LINK pci_ut 00:03:17.339 LINK spdk_nvme 00:03:17.339 LINK spdk_bdev 00:03:17.339 LINK spdk_nvme_perf 00:03:17.596 LINK mem_callbacks 00:03:17.596 LINK spdk_top 00:03:17.596 LINK llvm_nvme_fuzz 00:03:17.855 CC examples/vmd/led/led.o 00:03:17.855 CC app/vhost/vhost.o 00:03:17.855 CC examples/vmd/lsvmd/lsvmd.o 00:03:17.855 CC examples/sock/hello_world/hello_sock.o 00:03:17.855 CC examples/idxd/perf/perf.o 00:03:17.855 CC examples/thread/thread/thread_ex.o 00:03:17.855 LINK memory_ut 00:03:17.855 LINK led 00:03:17.855 LINK lsvmd 00:03:17.855 LINK vhost 00:03:17.855 LINK hello_sock 00:03:18.114 LINK idxd_perf 00:03:18.114 LINK spdk_lock 00:03:18.114 LINK thread 00:03:18.114 LINK iscsi_fuzz 00:03:18.682 CC examples/nvme/reconnect/reconnect.o 00:03:18.682 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:18.682 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:18.682 CC examples/nvme/hello_world/hello_world.o 00:03:18.682 CC examples/nvme/arbitration/arbitration.o 00:03:18.682 CC examples/nvme/abort/abort.o 00:03:18.682 CC 
examples/nvme/pmr_persistence/pmr_persistence.o 00:03:18.682 CC test/event/event_perf/event_perf.o 00:03:18.682 CC test/event/reactor_perf/reactor_perf.o 00:03:18.682 CC test/event/reactor/reactor.o 00:03:18.682 CC examples/nvme/hotplug/hotplug.o 00:03:18.682 CC test/event/app_repeat/app_repeat.o 00:03:18.682 CC test/event/scheduler/scheduler.o 00:03:18.682 LINK reactor 00:03:18.682 LINK event_perf 00:03:18.941 LINK reactor_perf 00:03:18.941 LINK pmr_persistence 00:03:18.941 LINK cmb_copy 00:03:18.941 LINK hello_world 00:03:18.941 LINK app_repeat 00:03:18.941 LINK hotplug 00:03:18.941 LINK reconnect 00:03:18.941 LINK arbitration 00:03:18.941 LINK scheduler 00:03:18.941 LINK abort 00:03:18.941 LINK nvme_manage 00:03:18.941 CC test/nvme/reserve/reserve.o 00:03:18.941 CC test/nvme/aer/aer.o 00:03:18.941 CC test/nvme/fdp/fdp.o 00:03:18.941 CC test/nvme/cuse/cuse.o 00:03:18.941 CC test/nvme/reset/reset.o 00:03:18.941 CC test/nvme/compliance/nvme_compliance.o 00:03:18.941 CC test/nvme/connect_stress/connect_stress.o 00:03:18.941 CC test/nvme/startup/startup.o 00:03:18.941 CC test/nvme/boot_partition/boot_partition.o 00:03:18.941 CC test/nvme/sgl/sgl.o 00:03:18.941 CC test/nvme/fused_ordering/fused_ordering.o 00:03:18.941 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:18.941 CC test/nvme/e2edp/nvme_dp.o 00:03:18.941 CC test/nvme/simple_copy/simple_copy.o 00:03:18.941 CC test/nvme/err_injection/err_injection.o 00:03:18.941 CC test/nvme/overhead/overhead.o 00:03:18.941 CC test/accel/dif/dif.o 00:03:19.200 CC test/blobfs/mkfs/mkfs.o 00:03:19.200 LINK reserve 00:03:19.200 CC test/lvol/esnap/esnap.o 00:03:19.200 LINK startup 00:03:19.200 LINK connect_stress 00:03:19.200 LINK boot_partition 00:03:19.200 LINK doorbell_aers 00:03:19.200 LINK fused_ordering 00:03:19.200 LINK aer 00:03:19.200 LINK err_injection 00:03:19.200 LINK simple_copy 00:03:19.200 LINK reset 00:03:19.200 LINK fdp 00:03:19.200 LINK nvme_dp 00:03:19.200 LINK sgl 00:03:19.200 LINK mkfs 00:03:19.200 LINK overhead 00:03:19.460 LINK nvme_compliance 00:03:19.460 LINK dif 00:03:19.720 CC examples/blob/cli/blobcli.o 00:03:19.720 CC examples/accel/perf/accel_perf.o 00:03:19.720 CC examples/blob/hello_world/hello_blob.o 00:03:19.720 CC examples/fsdev/hello_world/hello_fsdev.o 00:03:19.979 LINK cuse 00:03:19.979 LINK hello_blob 00:03:19.979 LINK hello_fsdev 00:03:19.979 LINK accel_perf 00:03:19.979 LINK blobcli 00:03:20.917 CC examples/bdev/hello_world/hello_bdev.o 00:03:20.917 CC examples/bdev/bdevperf/bdevperf.o 00:03:20.917 LINK hello_bdev 00:03:21.176 CC test/bdev/bdevio/bdevio.o 00:03:21.176 LINK bdevperf 00:03:21.434 LINK bdevio 00:03:22.812 LINK esnap 00:03:22.812 CC examples/nvmf/nvmf/nvmf.o 00:03:23.071 LINK nvmf 00:03:24.448 00:03:24.448 real 0m37.084s 00:03:24.448 user 4m40.508s 00:03:24.448 sys 1m44.106s 00:03:24.448 15:40:32 make -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:03:24.448 15:40:32 make -- common/autotest_common.sh@10 -- $ set +x 00:03:24.448 ************************************ 00:03:24.448 END TEST make 00:03:24.448 ************************************ 00:03:24.448 15:40:32 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:24.448 15:40:32 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:24.448 15:40:32 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:24.448 15:40:32 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:24.448 15:40:32 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:24.448 
15:40:32 -- pm/common@44 -- $ pid=1565578 00:03:24.448 15:40:32 -- pm/common@50 -- $ kill -TERM 1565578 00:03:24.448 15:40:32 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:24.448 15:40:32 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:24.448 15:40:32 -- pm/common@44 -- $ pid=1565580 00:03:24.448 15:40:32 -- pm/common@50 -- $ kill -TERM 1565580 00:03:24.448 15:40:32 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:24.448 15:40:32 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:24.448 15:40:32 -- pm/common@44 -- $ pid=1565582 00:03:24.448 15:40:32 -- pm/common@50 -- $ kill -TERM 1565582 00:03:24.448 15:40:32 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:24.448 15:40:32 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:24.448 15:40:32 -- pm/common@44 -- $ pid=1565604 00:03:24.448 15:40:32 -- pm/common@50 -- $ sudo -E kill -TERM 1565604 00:03:24.448 15:40:32 -- spdk/autorun.sh@26 -- $ (( SPDK_TEST_UNITTEST == 1 || SPDK_RUN_FUNCTIONAL_TEST == 1 )) 00:03:24.448 15:40:32 -- spdk/autorun.sh@27 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:03:24.448 15:40:32 -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:03:24.448 15:40:32 -- common/autotest_common.sh@1693 -- # lcov --version 00:03:24.448 15:40:32 -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:03:24.448 15:40:32 -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:03:24.448 15:40:32 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:24.448 15:40:32 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:24.448 15:40:32 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:24.448 15:40:32 -- scripts/common.sh@336 -- # IFS=.-: 00:03:24.448 15:40:32 -- scripts/common.sh@336 -- # read -ra ver1 00:03:24.448 15:40:32 -- scripts/common.sh@337 -- # IFS=.-: 00:03:24.448 15:40:32 -- scripts/common.sh@337 -- # read -ra ver2 00:03:24.448 15:40:32 -- scripts/common.sh@338 -- # local 'op=<' 00:03:24.448 15:40:32 -- scripts/common.sh@340 -- # ver1_l=2 00:03:24.448 15:40:32 -- scripts/common.sh@341 -- # ver2_l=1 00:03:24.448 15:40:32 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:24.448 15:40:32 -- scripts/common.sh@344 -- # case "$op" in 00:03:24.448 15:40:32 -- scripts/common.sh@345 -- # : 1 00:03:24.448 15:40:32 -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:24.448 15:40:32 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:24.448 15:40:32 -- scripts/common.sh@365 -- # decimal 1 00:03:24.448 15:40:32 -- scripts/common.sh@353 -- # local d=1 00:03:24.448 15:40:32 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:24.448 15:40:32 -- scripts/common.sh@355 -- # echo 1 00:03:24.448 15:40:32 -- scripts/common.sh@365 -- # ver1[v]=1 00:03:24.448 15:40:32 -- scripts/common.sh@366 -- # decimal 2 00:03:24.448 15:40:32 -- scripts/common.sh@353 -- # local d=2 00:03:24.448 15:40:32 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:24.448 15:40:32 -- scripts/common.sh@355 -- # echo 2 00:03:24.448 15:40:32 -- scripts/common.sh@366 -- # ver2[v]=2 00:03:24.448 15:40:32 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:24.708 15:40:32 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:24.708 15:40:32 -- scripts/common.sh@368 -- # return 0 00:03:24.708 15:40:32 -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:24.708 15:40:32 -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:03:24.708 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:24.708 --rc genhtml_branch_coverage=1 00:03:24.708 --rc genhtml_function_coverage=1 00:03:24.708 --rc genhtml_legend=1 00:03:24.708 --rc geninfo_all_blocks=1 00:03:24.708 --rc geninfo_unexecuted_blocks=1 00:03:24.708 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:24.708 ' 00:03:24.708 15:40:32 -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:03:24.708 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:24.708 --rc genhtml_branch_coverage=1 00:03:24.708 --rc genhtml_function_coverage=1 00:03:24.708 --rc genhtml_legend=1 00:03:24.708 --rc geninfo_all_blocks=1 00:03:24.708 --rc geninfo_unexecuted_blocks=1 00:03:24.708 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:24.708 ' 00:03:24.708 15:40:32 -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:03:24.708 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:24.708 --rc genhtml_branch_coverage=1 00:03:24.708 --rc genhtml_function_coverage=1 00:03:24.708 --rc genhtml_legend=1 00:03:24.708 --rc geninfo_all_blocks=1 00:03:24.708 --rc geninfo_unexecuted_blocks=1 00:03:24.708 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:24.708 ' 00:03:24.708 15:40:32 -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:03:24.708 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:24.708 --rc genhtml_branch_coverage=1 00:03:24.708 --rc genhtml_function_coverage=1 00:03:24.708 --rc genhtml_legend=1 00:03:24.708 --rc geninfo_all_blocks=1 00:03:24.708 --rc geninfo_unexecuted_blocks=1 00:03:24.708 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:24.708 ' 00:03:24.708 15:40:32 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:24.708 15:40:32 -- nvmf/common.sh@7 -- # uname -s 00:03:24.708 15:40:32 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:24.708 15:40:32 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:24.708 15:40:32 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:24.708 15:40:32 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:24.708 15:40:32 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:24.708 15:40:32 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:24.708 15:40:32 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:24.708 15:40:32 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:24.708 15:40:32 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:24.708 15:40:32 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:24.708 15:40:32 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:24.708 15:40:32 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:24.708 15:40:32 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:24.708 15:40:32 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:24.708 15:40:32 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:24.708 15:40:32 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:24.708 15:40:32 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:24.708 15:40:32 -- scripts/common.sh@15 -- # shopt -s extglob 00:03:24.708 15:40:32 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:24.708 15:40:32 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:24.708 15:40:32 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:24.708 15:40:32 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:24.708 15:40:32 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:24.708 15:40:32 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:24.708 15:40:32 -- paths/export.sh@5 -- # export PATH 00:03:24.708 15:40:32 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:24.708 15:40:32 -- nvmf/common.sh@51 -- # : 0 00:03:24.708 15:40:32 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:03:24.708 15:40:32 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:03:24.708 15:40:32 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:24.708 15:40:32 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:24.708 15:40:32 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:24.708 15:40:32 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:03:24.708 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:03:24.708 15:40:32 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:03:24.708 15:40:32 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:03:24.708 15:40:32 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:03:24.708 15:40:32 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:24.708 15:40:32 -- spdk/autotest.sh@32 -- # uname -s 00:03:24.708 
15:40:32 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:24.709 15:40:32 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:24.709 15:40:32 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:24.709 15:40:32 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:24.709 15:40:32 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:24.709 15:40:32 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:24.709 15:40:32 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:24.709 15:40:32 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:24.709 15:40:32 -- spdk/autotest.sh@48 -- # udevadm_pid=1646234 00:03:24.709 15:40:32 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:24.709 15:40:32 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:24.709 15:40:32 -- pm/common@17 -- # local monitor 00:03:24.709 15:40:32 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:24.709 15:40:32 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:24.709 15:40:32 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:24.709 15:40:32 -- pm/common@21 -- # date +%s 00:03:24.709 15:40:32 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:24.709 15:40:32 -- pm/common@21 -- # date +%s 00:03:24.709 15:40:32 -- pm/common@25 -- # sleep 1 00:03:24.709 15:40:32 -- pm/common@21 -- # date +%s 00:03:24.709 15:40:32 -- pm/common@21 -- # date +%s 00:03:24.709 15:40:32 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732977632 00:03:24.709 15:40:32 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732977632 00:03:24.709 15:40:32 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732977632 00:03:24.709 15:40:32 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732977632 00:03:24.709 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732977632_collect-vmstat.pm.log 00:03:24.709 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732977632_collect-cpu-load.pm.log 00:03:24.709 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732977632_collect-cpu-temp.pm.log 00:03:24.709 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732977632_collect-bmc-pm.bmc.pm.log 00:03:25.647 15:40:33 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:25.647 15:40:33 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:25.647 15:40:33 -- common/autotest_common.sh@726 -- # xtrace_disable 00:03:25.647 15:40:33 -- common/autotest_common.sh@10 -- # set +x 
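The autotest.sh trace above saves the stock systemd-coredump handler ('|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h') and emits SPDK's core-collector.sh handler string before starting the per-resource monitors. A minimal sketch of that save/replace/restore pattern, assuming the handler string is written to the standard /proc/sys/kernel/core_pattern and using illustrative paths rather than SPDK's verbatim code:

#!/usr/bin/env bash
# Route kernel core dumps to a collector script for the duration of a test
# run, then restore the previous handler on exit. Requires root; core(5)
# defines the %P (pid), %s (signal) and %t (timestamp) specifiers used below.
out_dir=/tmp/autotest-output                     # illustrative output root
collector=/opt/spdk/scripts/core-collector.sh    # illustrative collector path

old_core_pattern=$(</proc/sys/kernel/core_pattern)
mkdir -p "$out_dir/coredumps"
echo "|$collector %P %s %t" > /proc/sys/kernel/core_pattern
trap 'echo "$old_core_pattern" > /proc/sys/kernel/core_pattern' EXIT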
00:03:25.647 15:40:33 -- spdk/autotest.sh@59 -- # create_test_list 00:03:25.647 15:40:33 -- common/autotest_common.sh@752 -- # xtrace_disable 00:03:25.647 15:40:33 -- common/autotest_common.sh@10 -- # set +x 00:03:25.647 15:40:33 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:25.647 15:40:33 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:25.647 15:40:33 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:25.647 15:40:33 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:25.647 15:40:33 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:25.647 15:40:33 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:25.647 15:40:33 -- common/autotest_common.sh@1457 -- # uname 00:03:25.647 15:40:33 -- common/autotest_common.sh@1457 -- # '[' Linux = FreeBSD ']' 00:03:25.647 15:40:33 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:25.647 15:40:33 -- common/autotest_common.sh@1477 -- # uname 00:03:25.647 15:40:33 -- common/autotest_common.sh@1477 -- # [[ Linux = FreeBSD ]] 00:03:25.647 15:40:33 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:03:25.647 15:40:33 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:03:25.647 lcov: LCOV version 1.15 00:03:25.647 15:40:33 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:03:33.773 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:33.773 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:03:41.898 15:40:49 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:03:41.898 15:40:49 -- common/autotest_common.sh@726 -- # xtrace_disable 00:03:41.898 15:40:49 -- common/autotest_common.sh@10 -- # set +x 00:03:41.898 15:40:49 -- spdk/autotest.sh@78 -- # rm -f 00:03:41.898 15:40:49 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:44.435 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:44.435 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:44.435 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:44.693 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:44.693 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:44.693 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:44.693 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:44.693 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:44.693 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:44.693 
0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:44.693 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:44.693 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:44.693 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:44.952 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:44.952 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:44.952 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:44.952 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:44.952 15:40:52 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:03:44.952 15:40:52 -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:03:44.952 15:40:52 -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:03:44.952 15:40:52 -- common/autotest_common.sh@1658 -- # local nvme bdf 00:03:44.952 15:40:52 -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:03:44.952 15:40:52 -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:03:44.952 15:40:52 -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:03:44.952 15:40:52 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:44.952 15:40:52 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:03:44.952 15:40:52 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:03:44.952 15:40:52 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:03:44.952 15:40:52 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:03:44.952 15:40:52 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:03:44.952 15:40:52 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:03:44.952 15:40:52 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:44.952 No valid GPT data, bailing 00:03:44.952 15:40:52 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:44.952 15:40:52 -- scripts/common.sh@394 -- # pt= 00:03:44.952 15:40:52 -- scripts/common.sh@395 -- # return 1 00:03:44.952 15:40:52 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:44.952 1+0 records in 00:03:44.952 1+0 records out 00:03:44.952 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0046759 s, 224 MB/s 00:03:44.952 15:40:52 -- spdk/autotest.sh@105 -- # sync 00:03:44.952 15:40:52 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:44.952 15:40:52 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:44.952 15:40:52 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:53.076 15:41:00 -- spdk/autotest.sh@111 -- # uname -s 00:03:53.076 15:41:00 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:03:53.076 15:41:00 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:03:53.076 15:41:00 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:53.076 15:41:00 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:03:53.076 15:41:00 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:03:53.076 15:41:00 -- common/autotest_common.sh@10 -- # set +x 00:03:53.076 ************************************ 00:03:53.076 START TEST setup.sh 00:03:53.076 ************************************ 00:03:53.076 15:41:00 setup.sh -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:53.076 * Looking for test storage... 
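In the pre-cleanup just traced, get_zoned_devs filters out zoned namespaces (a device is zoned when /sys/block/<dev>/queue/zoned reads anything other than 'none'), spdk-gpt.py and blkid then find no partition signature ('No valid GPT data, bailing'), and only then is the first MiB zeroed with dd. A standalone sketch of that filter-and-wipe flow, with the GPT probe reduced to a comment (the device globs mirror the trace; the rest is illustrative, not the exact setup code):

#!/usr/bin/env bash
shopt -s extglob
# Collect zoned block devices: these must never be zero-filled blindly.
declare -A zoned_devs=()
for sys in /sys/block/nvme*; do
    [[ -e $sys/queue/zoned && $(<"$sys/queue/zoned") != none ]] &&
        zoned_devs[${sys##*/}]=1
done
# nvme*n!(*p*) matches whole namespaces but not partitions, as in the trace.
for dev in /dev/nvme*n!(*p*); do
    [[ -n ${zoned_devs[${dev##*/}]} ]] && continue
    # The real run only wipes when blkid/spdk-gpt.py report no valid GPT data.
    dd if=/dev/zero of="$dev" bs=1M count=1
done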
00:03:53.076 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:53.076 15:41:00 setup.sh -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:03:53.076 15:41:00 setup.sh -- common/autotest_common.sh@1693 -- # lcov --version 00:03:53.076 15:41:00 setup.sh -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:03:53.076 15:41:00 setup.sh -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@345 -- # : 1 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@353 -- # local d=1 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@355 -- # echo 1 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:03:53.076 15:41:00 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:03:53.077 15:41:00 setup.sh -- scripts/common.sh@353 -- # local d=2 00:03:53.077 15:41:00 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:53.077 15:41:00 setup.sh -- scripts/common.sh@355 -- # echo 2 00:03:53.077 15:41:00 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:03:53.077 15:41:00 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:53.077 15:41:00 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:53.077 15:41:00 setup.sh -- scripts/common.sh@368 -- # return 0 00:03:53.077 15:41:00 setup.sh -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:53.077 15:41:00 setup.sh -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:03:53.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.077 --rc genhtml_branch_coverage=1 00:03:53.077 --rc genhtml_function_coverage=1 00:03:53.077 --rc genhtml_legend=1 00:03:53.077 --rc geninfo_all_blocks=1 00:03:53.077 --rc geninfo_unexecuted_blocks=1 00:03:53.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:53.077 ' 00:03:53.077 15:41:00 setup.sh -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:03:53.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.077 --rc genhtml_branch_coverage=1 00:03:53.077 --rc genhtml_function_coverage=1 00:03:53.077 --rc genhtml_legend=1 00:03:53.077 --rc geninfo_all_blocks=1 00:03:53.077 --rc geninfo_unexecuted_blocks=1 
00:03:53.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:53.077 ' 00:03:53.077 15:41:00 setup.sh -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:03:53.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.077 --rc genhtml_branch_coverage=1 00:03:53.077 --rc genhtml_function_coverage=1 00:03:53.077 --rc genhtml_legend=1 00:03:53.077 --rc geninfo_all_blocks=1 00:03:53.077 --rc geninfo_unexecuted_blocks=1 00:03:53.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:53.077 ' 00:03:53.077 15:41:00 setup.sh -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:03:53.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.077 --rc genhtml_branch_coverage=1 00:03:53.077 --rc genhtml_function_coverage=1 00:03:53.077 --rc genhtml_legend=1 00:03:53.077 --rc geninfo_all_blocks=1 00:03:53.077 --rc geninfo_unexecuted_blocks=1 00:03:53.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:53.077 ' 00:03:53.077 15:41:00 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:53.077 15:41:00 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:53.077 15:41:00 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:53.077 15:41:00 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:03:53.077 15:41:00 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:03:53.077 15:41:00 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:53.077 ************************************ 00:03:53.077 START TEST acl 00:03:53.077 ************************************ 00:03:53.077 15:41:00 setup.sh.acl -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:53.077 * Looking for test storage... 
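The cmp_versions trace just repeated (and seen earlier in the autotest prologue) is a field-wise version gate: the version strings are split on '.', '-' and ':' and compared numerically, so 'lt 1.15 2' decides whether the detected lcov (LCOV version 1.15 above) predates 2.x and still needs the --rc lcov_branch_coverage / --rc lcov_function_coverage overrides. A simplified standalone equivalent of that comparison (not the exact scripts/common.sh helper):

#!/usr/bin/env bash
# Field-wise "less than" for dotted versions, padding the shorter with zeros.
version_lt() {
    local -a a b
    IFS=.-: read -ra a <<< "$1"
    IFS=.-: read -ra b <<< "$2"
    local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
    for (( i = 0; i < n; i++ )); do
        (( ${a[i]:-0} < ${b[i]:-0} )) && return 0   # strictly smaller field
        (( ${a[i]:-0} > ${b[i]:-0} )) && return 1   # strictly larger field
    done
    return 1   # all fields equal => not less-than
}
version_lt 1.15 2 && echo 'lcov < 2: use --rc lcov_*_coverage options'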
00:03:53.077 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:53.077 15:41:00 setup.sh.acl -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:03:53.077 15:41:00 setup.sh.acl -- common/autotest_common.sh@1693 -- # lcov --version 00:03:53.077 15:41:00 setup.sh.acl -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:03:53.077 15:41:00 setup.sh.acl -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:53.077 15:41:00 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:03:53.077 15:41:00 setup.sh.acl -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:53.077 15:41:00 setup.sh.acl -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:03:53.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.077 --rc genhtml_branch_coverage=1 00:03:53.077 --rc genhtml_function_coverage=1 00:03:53.077 --rc genhtml_legend=1 00:03:53.077 --rc geninfo_all_blocks=1 00:03:53.077 --rc geninfo_unexecuted_blocks=1 00:03:53.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:53.077 ' 00:03:53.077 15:41:00 setup.sh.acl -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:03:53.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.077 --rc genhtml_branch_coverage=1 00:03:53.077 --rc 
genhtml_function_coverage=1 00:03:53.077 --rc genhtml_legend=1 00:03:53.077 --rc geninfo_all_blocks=1 00:03:53.077 --rc geninfo_unexecuted_blocks=1 00:03:53.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:53.077 ' 00:03:53.077 15:41:00 setup.sh.acl -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:03:53.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.077 --rc genhtml_branch_coverage=1 00:03:53.077 --rc genhtml_function_coverage=1 00:03:53.077 --rc genhtml_legend=1 00:03:53.077 --rc geninfo_all_blocks=1 00:03:53.077 --rc geninfo_unexecuted_blocks=1 00:03:53.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:53.077 ' 00:03:53.077 15:41:00 setup.sh.acl -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:03:53.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.077 --rc genhtml_branch_coverage=1 00:03:53.077 --rc genhtml_function_coverage=1 00:03:53.077 --rc genhtml_legend=1 00:03:53.077 --rc geninfo_all_blocks=1 00:03:53.078 --rc geninfo_unexecuted_blocks=1 00:03:53.078 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:53.078 ' 00:03:53.078 15:41:00 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:53.078 15:41:00 setup.sh.acl -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:03:53.078 15:41:00 setup.sh.acl -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:03:53.078 15:41:00 setup.sh.acl -- common/autotest_common.sh@1658 -- # local nvme bdf 00:03:53.078 15:41:00 setup.sh.acl -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:03:53.078 15:41:00 setup.sh.acl -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:03:53.078 15:41:00 setup.sh.acl -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:03:53.078 15:41:00 setup.sh.acl -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:53.078 15:41:00 setup.sh.acl -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:03:53.078 15:41:00 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:53.078 15:41:00 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:53.078 15:41:00 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:53.078 15:41:00 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:53.078 15:41:00 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:53.078 15:41:00 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:53.078 15:41:00 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:57.269 15:41:04 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:57.269 15:41:04 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:57.269 15:41:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:57.269 15:41:04 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:57.269 15:41:04 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:57.269 15:41:04 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:59.804 Hugepages 00:03:59.804 node hugesize free / total 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 00:03:59.804 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme 
]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:59.804 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:00.063 15:41:07 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:00.063 15:41:07 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:00.063 15:41:07 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:00.063 15:41:07 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:00.063 ************************************ 00:04:00.063 START TEST denied 00:04:00.063 ************************************ 00:04:00.064 15:41:07 setup.sh.acl.denied -- 
common/autotest_common.sh@1129 -- # denied 00:04:00.064 15:41:07 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:00.064 15:41:07 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:00.064 15:41:07 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:00.064 15:41:07 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:00.064 15:41:07 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:04.254 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:04:04.254 15:41:11 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:04.254 15:41:11 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:04.254 15:41:11 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:04.254 15:41:11 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:04.255 15:41:11 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:04.255 15:41:11 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:04.255 15:41:11 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:04.255 15:41:11 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:04.255 15:41:11 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:04.255 15:41:11 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:08.444 00:04:08.444 real 0m7.722s 00:04:08.444 user 0m2.206s 00:04:08.444 sys 0m4.800s 00:04:08.444 15:41:15 setup.sh.acl.denied -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:08.444 15:41:15 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:08.444 ************************************ 00:04:08.444 END TEST denied 00:04:08.444 ************************************ 00:04:08.444 15:41:15 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:08.444 15:41:15 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:08.444 15:41:15 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:08.444 15:41:15 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:08.444 ************************************ 00:04:08.444 START TEST allowed 00:04:08.444 ************************************ 00:04:08.444 15:41:15 setup.sh.acl.allowed -- common/autotest_common.sh@1129 -- # allowed 00:04:08.444 15:41:15 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:04:08.444 15:41:15 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:08.444 15:41:15 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:04:08.444 15:41:15 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:08.444 15:41:15 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:12.727 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:12.727 15:41:20 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:12.727 15:41:20 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:12.727 15:41:20 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:12.727 15:41:20 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:12.727 15:41:20 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:16.918 00:04:16.918 real 0m8.483s 00:04:16.918 user 0m2.246s 00:04:16.918 sys 0m4.703s 00:04:16.918 15:41:24 setup.sh.acl.allowed -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:16.918 15:41:24 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:16.918 ************************************ 00:04:16.918 END TEST allowed 00:04:16.918 ************************************ 00:04:16.918 00:04:16.918 real 0m23.883s 00:04:16.918 user 0m7.209s 00:04:16.918 sys 0m14.698s 00:04:16.918 15:41:24 setup.sh.acl -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:16.918 15:41:24 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:16.918 ************************************ 00:04:16.918 END TEST acl 00:04:16.918 ************************************ 00:04:16.918 15:41:24 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:16.918 15:41:24 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:16.918 15:41:24 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:16.918 15:41:24 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:16.918 ************************************ 00:04:16.918 START TEST hugepages 00:04:16.918 ************************************ 00:04:16.918 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:16.918 * Looking for test storage... 00:04:16.918 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:16.918 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:16.918 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lcov --version 00:04:16.918 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:16.918 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:16.918 15:41:24 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:04:16.918 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:16.918 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:16.918 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.918 --rc genhtml_branch_coverage=1 00:04:16.918 --rc genhtml_function_coverage=1 00:04:16.918 --rc genhtml_legend=1 00:04:16.918 --rc geninfo_all_blocks=1 00:04:16.918 --rc geninfo_unexecuted_blocks=1 00:04:16.918 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:16.918 ' 00:04:16.918 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:16.918 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.918 --rc genhtml_branch_coverage=1 00:04:16.918 --rc genhtml_function_coverage=1 00:04:16.918 --rc genhtml_legend=1 00:04:16.918 --rc geninfo_all_blocks=1 00:04:16.918 --rc geninfo_unexecuted_blocks=1 00:04:16.918 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:16.919 ' 00:04:16.919 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:16.919 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.919 --rc genhtml_branch_coverage=1 00:04:16.919 --rc genhtml_function_coverage=1 00:04:16.919 --rc genhtml_legend=1 00:04:16.919 --rc geninfo_all_blocks=1 00:04:16.919 --rc geninfo_unexecuted_blocks=1 00:04:16.919 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:16.919 ' 00:04:16.919 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:16.919 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.919 --rc genhtml_branch_coverage=1 00:04:16.919 --rc genhtml_function_coverage=1 00:04:16.919 --rc genhtml_legend=1 00:04:16.919 --rc geninfo_all_blocks=1 00:04:16.919 --rc geninfo_unexecuted_blocks=1 00:04:16.919 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:16.919 ' 00:04:16.919 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:16.919 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:16.919 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:16.919 15:41:24 
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@18 -- # local node=
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@19 -- # local var val
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 39277372 kB' 'MemAvailable: 40926592 kB' 'Buffers: 6816 kB' 'Cached: 11402336 kB' 'SwapCached: 276 kB' 'Active: 8881732 kB' 'Inactive: 3144536 kB' 'Active(anon): 7973876 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 620288 kB' 'Mapped: 130944 kB' 'Shmem: 9660072 kB' 'KReclaimable: 596940 kB' 'Slab: 1603988 kB' 'SReclaimable: 596940 kB' 'SUnreclaim: 1007048 kB' 'KernelStack: 21904 kB' 'PageTables: 8756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 12224276 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217988 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:16.919 15:41:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
[... the identical IFS=': ' / read -r var val _ / [[ FIELD == \H\u\g\e\p\a\g\e\s\i\z\e ]] / continue xtrace repeats for every remaining /proc/meminfo field, MemFree through HugePages_Surp ...]
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/common.sh@33 -- # return 0
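What the trace above is doing: the common.sh get_meminfo helper slurps /proc/meminfo (or a per-node meminfo file when a node is given), strips any "Node N " prefix, then walks the "Field: value" lines until the requested field matches and echoes its value, hence the long run of compare/continue steps ending in echo 2048 for Hugepagesize. A minimal standalone sketch of the same pattern; the name get_mem and its interface are illustrative assumptions, not SPDK's API:

    #!/usr/bin/env bash
    shopt -s extglob                       # needed for the +([0-9]) pattern below

    get_mem() {                            # get_mem FIELD [NODE]
        local get=$1 node=${2:-} line var val _ mem_f mem
        mem_f=/proc/meminfo
        # Per-NUMA-node counters live in /sys/devices/system/node/node<N>/meminfo.
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
            && mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node N " prefix, if present
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val"                    # e.g. 2048 for Hugepagesize
            return 0
        done
        return 1
    }

    get_mem Hugepagesize                   # -> 2048 on a 2 MiB-hugepage host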
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}"
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}"
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes
00:04:16.921 15:41:24 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup
00:04:16.921 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:04:16.921 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:04:16.921 15:41:24 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:16.921 ************************************
00:04:16.921 START TEST single_node_setup
00:04:16.921 ************************************
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1129 -- # single_node_setup
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@48 -- # local size=2097152
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0')
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]]
00:04:16.921 15:41:24 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:20.205 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:20.205 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:22.118 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
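With NRHUGE=1024 and HUGENODE=0 exported, scripts/setup.sh rebinds the devices above to vfio-pci and populates the hugepage pool. The pool side of that work goes through the kernel's hugetlb sysfs files, the same ones the clear_hp loop wrote 0 into earlier. A rough sketch of the clear-then-allocate cycle; the function names are illustrative, while the sysfs paths are the standard kernel interface (root required for the writes):

    #!/usr/bin/env bash
    clear_hugepages() {                    # what the traced clear_hp loop does
        local node hp
        for node in /sys/devices/system/node/node[0-9]*; do
            for hp in "$node"/hugepages/hugepages-*; do
                echo 0 > "$hp/nr_hugepages"
            done
        done
    }

    alloc_node_hugepages() {               # alloc_node_hugepages NODE NR [SIZE_KB]
        local node=$1 nr=$2 size=${3:-2048}
        echo "$nr" > "/sys/devices/system/node/node$node/hugepages/hugepages-${size}kB/nr_hugepages"
    }

    clear_hugepages
    alloc_node_hugepages 0 1024            # NRHUGE=1024 on HUGENODE=0: 1024 x 2048 kB = 2 GiB

The 1024 is exactly the get_test_nr_hugepages arithmetic in the trace: the requested 2097152 kB divided by the 2048 kB default hugepage size.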
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@137 -- # verify_nr_hugepages
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41471608 kB' 'MemAvailable: 43120804 kB' 'Buffers: 6816 kB' 'Cached: 11402480 kB' 'SwapCached: 276 kB' 'Active: 8883608 kB' 'Inactive: 3144536 kB' 'Active(anon): 7975752 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621944 kB' 'Mapped: 131060 kB' 'Shmem: 9660216 kB' 'KReclaimable: 596916 kB' 'Slab: 1602240 kB' 'SReclaimable: 596916 kB' 'SUnreclaim: 1005324 kB' 'KernelStack: 21984 kB' 'PageTables: 8612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12225196 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217908 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:22.118 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
[... the identical IFS=': ' / read -r var val _ / [[ FIELD == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] / continue xtrace repeats for every field from MemFree through HardwareCorrupted ...]
00:04:22.119 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:22.119 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:04:22.119 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0
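The anon=0 just computed comes from a transparent-hugepage gate: the [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] step is the shell-quoted form of testing whether the current THP mode string contains [never], and only when THP is not fully disabled does verify_nr_hugepages read AnonHugePages at all. A compact sketch of that gate, reusing the hypothetical get_mem helper from the earlier sketch:

    # The mode file reads like "always [madvise] never"; brackets mark the active mode.
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_mem AnonHugePages)      # kB of anonymous THP currently in use
    else
        anon=0
    fi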
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41476384 kB' 'MemAvailable: 43125580 kB' 'Buffers: 6816 kB' 'Cached: 11402484 kB' 'SwapCached: 276 kB' 'Active: 8884508 kB' 'Inactive: 3144536 kB' 'Active(anon): 7976652 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 622408 kB' 'Mapped: 131544 kB' 'Shmem: 9660220 kB' 'KReclaimable: 596916 kB' 'Slab: 1602292 kB' 'SReclaimable: 596916 kB' 'SUnreclaim: 1005376 kB' 'KernelStack: 22000 kB' 'PageTables: 8676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12226440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217924 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:22.120 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
[... the identical IFS=': ' / read -r var val _ / [[ FIELD == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue xtrace repeats for every field from MemFree through HugePages_Rsvd ...]
00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0
00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local
get=HugePages_Rsvd 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41469988 kB' 'MemAvailable: 43119184 kB' 'Buffers: 6816 kB' 'Cached: 11402500 kB' 'SwapCached: 276 kB' 'Active: 8888456 kB' 'Inactive: 3144536 kB' 'Active(anon): 7980600 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 627232 kB' 'Mapped: 131804 kB' 'Shmem: 9660236 kB' 'KReclaimable: 596916 kB' 'Slab: 1602292 kB' 'SReclaimable: 596916 kB' 'SUnreclaim: 1005376 kB' 'KernelStack: 21920 kB' 'PageTables: 8664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12231356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217960 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 
15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.122 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.123 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:04:22.124 nr_hugepages=1024 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:04:22.124 resv_hugepages=0 00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:04:22.124 surplus_hugepages=0 00:04:22.124 15:41:29 
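
What the xtrace wall above is actually doing: the get_meminfo helper in setup/common.sh snapshots /proc/meminfo (or a node's own meminfo file) into an array, then walks it one "key: value" pair at a time until the requested key matches, echoing its value. Below is a minimal sketch of that helper reconstructed from the traced commands; the function layout, the shopt -s extglob line, and the process-substitution plumbing are assumptions made so the sketch runs standalone, not a verbatim copy of the SPDK script.

    shopt -s extglob

    get_meminfo() {
        local get=$1   # meminfo key to report, e.g. HugePages_Rsvd
        local node=$2  # optional NUMA node number; empty means system-wide
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # A per-node query reads that node's own meminfo file instead.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node <n> "; strip it so
        # the same key names work for global and per-node lookups alike.
        mem=("${mem[@]#Node +([0-9]) }")
        # IFS=': ' splits "HugePages_Rsvd:      0" into key and value; the
        # trailing _ swallows the "kB" unit where one is present.
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"   # e.g. 0 for HugePages_Rsvd in the run above
                return 0
            fi
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

Called as get_meminfo HugePages_Rsvd it scans the system-wide file; called as get_meminfo HugePages_Surp 0 it reads /sys/devices/system/node/node0/meminfo, which are exactly the two shapes of call visible in this log.
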
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:22.124 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41469980 kB' 'MemAvailable: 43119176 kB' 'Buffers: 6816 kB' 'Cached: 11402540 kB' 'SwapCached: 276 kB' 'Active: 8884348 kB' 'Inactive: 3144536 kB' 'Active(anon): 7976492 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 622628 kB' 'Mapped: 131804 kB' 'Shmem: 9660276 kB' 'KReclaimable: 596916 kB' 'Slab: 1602292 kB' 'SReclaimable: 596916 kB' 'SUnreclaim: 1005376 kB' 'KernelStack: 22096 kB' 'PageTables: 9048 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12226668 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218004 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:22.125 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31-32 -- # (condensed: keys MemTotal through Unaccepted in /proc/meminfo order each fail [[ $var == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] and hit continue)
00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024
00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
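
With surp=0, resv=0 and nr_hugepages=1024 in hand, setup/hugepages.sh asserts the accounting identity HugePages_Total == nr_hugepages + surp + resv before checking how the pages are spread across NUMA nodes (the get_nodes trace that follows). Below is a hedged sketch of that bookkeeping, reusing the hypothetical get_meminfo sketch above and the values from this run; running it as a standalone script is an assumption, not how the SPDK suite invokes it.

    shopt -s extglob

    nr_hugepages=1024                       # requested pool size in this run
    surp=$(get_meminfo HugePages_Surp)      # 0 above
    resv=$(get_meminfo HugePages_Rsvd)      # 0 above
    total=$(get_meminfo HugePages_Total)    # 1024 above

    # The pool is consistent only when the kernel's total equals the
    # requested count plus surplus plus reserved pages.
    (( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2

    # Enumerate NUMA nodes the way get_nodes does; with two nodes and a
    # single-node setup, node0 is expected to carry the whole 1024-page pool.
    for node in /sys/devices/system/node/node+([0-9]); do
        n=${node##*node}
        echo "node$n HugePages_Surp: $(get_meminfo HugePages_Surp "$n")"
    done
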
setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:22.126 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22186028 kB' 'MemUsed: 10448408 kB' 'SwapCached: 176 kB' 'Active: 5474700 kB' 'Inactive: 535948 kB' 'Active(anon): 4696760 kB' 'Inactive(anon): 744 kB' 'Active(file): 777940 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5730520 kB' 'Mapped: 74620 kB' 'AnonPages: 283412 kB' 'Shmem: 4417200 kB' 'KernelStack: 11288 kB' 'PageTables: 4588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 410628 kB' 'Slab: 896136 kB' 'SReclaimable: 410628 kB' 'SUnreclaim: 485508 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:22.127 15:41:29 setup.sh.hugepages.single_node_setup -- 
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:04:22.128 node0=1024 expecting 1024
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:04:22.128
00:04:22.128 real 0m5.273s
00:04:22.128 user 0m1.413s
00:04:22.128 sys 0m2.367s
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1130 -- # xtrace_disable
00:04:22.128 15:41:29 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x
00:04:22.128 ************************************
00:04:22.128 END TEST single_node_setup
00:04:22.128 ************************************
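Every pass in this trace funnels through the same get_meminfo helper, whose per-field scans produce the long IFS/read/continue runs condensed above. A hedged reconstruction of the helper, inferred from the setup/common.sh@17-@33 records (the shipped script may differ in detail):

    #!/usr/bin/env bash
    shopt -s extglob
    # Return one field from /proc/meminfo, or from a per-node snapshot when a
    # node index is given (those lines carry a "Node N " prefix that is stripped).
    get_meminfo() {
        local get=$1 node=$2
        local var val _
        local mem_f=/proc/meminfo mem
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")
        local line
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val"               # e.g. 1024 for HugePages_Total
            return 0
        done
        return 1
    }

Under that reading, get_meminfo HugePages_Total 0 is exactly the call whose trace ended with "echo 1024" earlier in this pass.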
00:04:22.128 15:41:29 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc
00:04:22.128 15:41:29 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:04:22.128 15:41:29 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:04:22.128 15:41:29 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:22.128 ************************************
00:04:22.128 START TEST even_2G_alloc
00:04:22.128 ************************************
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1129 -- # even_2G_alloc
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:04:22.128 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:22.129 15:41:30 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:25.420 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:25.420 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:25.420 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:25.420 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:25.420 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:25.420 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:25.420 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:25.420 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:25.420 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:25.683 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:25.683 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:25.683 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:25.683 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:25.683 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:25.683 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:25.683 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:25.683 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
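Before verify_nr_hugepages starts reading those numbers back, it is worth unpacking the ": 512" / ": 1" no-op records from get_test_nr_hugepages_per_node above: they are the xtrace of the arithmetic that spreads the requested pages evenly across the nodes. A sketch consistent with those records (an assumed reconstruction, not the verbatim helper):

    #!/usr/bin/env bash
    # Spread the requested pages across the NUMA nodes, last node first.
    # With _nr_hugepages=1024 and _no_nodes=2 the ": $(( ... ))" no-ops trace
    # exactly as the ": 512", ": 1", ": 512", ": 0", ": 0" records above.
    _nr_hugepages=1024
    _no_nodes=2
    declare -a nodes_test
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
        : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))   # pages still to place
        : $(( --_no_nodes ))                                  # nodes still to fill
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"      # node0=512 node1=512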
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:25.683 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41462100 kB' 'MemAvailable: 43111280 kB' 'Buffers: 6816 kB' 'Cached: 11402644 kB' 'SwapCached: 276 kB' 'Active: 8883032 kB' 'Inactive: 3144536 kB' 'Active(anon): 7975176 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 620604 kB' 'Mapped: 129988 kB' 'Shmem: 9660380 kB' 'KReclaimable: 596900 kB' 'Slab: 1602028 kB' 'SReclaimable: 596900 kB' 'SUnreclaim: 1005128 kB' 'KernelStack: 21888 kB' 'PageTables: 8484 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12215876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218116 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:25.683 [xtrace condensed: each field of the snapshot above is compared against AnonHugePages and skipped via continue until the AnonHugePages line itself matches]
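The setup/hugepages.sh@95 record above is the transparent-hugepage gate: /sys/kernel/mm/transparent_hugepage/enabled reads e.g. "always [madvise] never", the brackets marking the active mode, and AnonHugePages only counts toward the total when that mode is not [never]. A hedged sketch of the gate (reconstructed from the already-expanded test, not the verbatim script):

    #!/usr/bin/env bash
    # Count anonymous hugepages only when THP is not disabled outright.
    anon=0
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # "always [madvise] never"
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)                 # helper sketched earlier
    fi
    echo "anon=$anon"                                     # 0 in the snapshot above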
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:25.685 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41463032 kB' 'MemAvailable: 43112212 kB' 'Buffers: 6816 kB' 'Cached: 11402648 kB' 'SwapCached: 276 kB' 'Active: 8882764 kB' 'Inactive: 3144536 kB' 'Active(anon): 7974908 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 620856 kB' 'Mapped: 129892 kB' 'Shmem: 9660384 kB' 'KReclaimable: 596900 kB' 'Slab: 1602012 kB' 'SReclaimable: 596900 kB' 'SUnreclaim: 1005112 kB' 'KernelStack: 21888 kB' 'PageTables: 8488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12215896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:25.685 [xtrace condensed: the same per-field scan runs over this system-wide snapshot, skipping every field via continue until HugePages_Surp matches]
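The value this scan returns becomes surp, and HugePages_Rsvd is fetched the same way just below; together with anon they feed the accounting that verify_nr_hugepages asserts. A sketch of that check, assuming it mirrors the (( 1024 == nr_hugepages + surp + resv )) record from the single_node_setup pass above:

    #!/usr/bin/env bash
    nr_hugepages=1024                            # requested by the test
    surp=$(get_meminfo HugePages_Surp)           # helper sketched earlier
    resv=$(get_meminfo HugePages_Rsvd)
    total=$(get_meminfo HugePages_Total)
    # The pool must account for every requested page plus surplus and reserved.
    (( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2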
setup/common.sh@31 -- # IFS=': ' 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41463032 kB' 'MemAvailable: 43112212 kB' 'Buffers: 6816 kB' 'Cached: 11402648 kB' 'SwapCached: 276 kB' 'Active: 8882804 kB' 'Inactive: 3144536 kB' 'Active(anon): 7974948 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 620884 kB' 'Mapped: 129892 kB' 'Shmem: 9660384 kB' 'KReclaimable: 596900 kB' 'Slab: 1602012 kB' 'SReclaimable: 596900 kB' 'SUnreclaim: 1005112 kB' 'KernelStack: 21904 kB' 'PageTables: 8540 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12215916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 
0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.687 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.688 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.688 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.688 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.688 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:25.688 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.688 15:41:33 
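The wall of xtrace above is one key lookup unrolled: get_meminfo reads /proc/meminfo (or, for per-node queries, that node's own meminfo file) entry by entry until the requested key matches, then echoes its value. A minimal sketch of the pattern being traced, under a hypothetical name (get_meminfo_sketch, not the verbatim setup/common.sh helper):

    # Print the value for one meminfo key; fall back to 0 if absent.
    # Hypothetical stand-in for the get_meminfo helper traced above.
    get_meminfo_sketch() {
        local get=$1 node=$2 mem_f=/proc/meminfo var val _
        # Per-node lookups switch to the node's own file (common.sh@23-24).
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        # Node files prefix each line with "Node <n> "; strip that so the
        # key lands in $var, then compare key by key as the trace does.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < <(sed 's/^Node [0-9]* //' "$mem_f")
        echo 0
    }

With the snapshot printed above, get_meminfo_sketch HugePages_Rsvd yields the 0 echoed at common.sh@33, just as surp=0 was produced for HugePages_Surp a few lines earlier.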
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
resv_hugepages=0
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
anon_hugepages=0
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:25.689 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:25.690 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41463320 kB' 'MemAvailable: 43112500 kB' 'Buffers: 6816 kB' 'Cached: 11402704 kB' 'SwapCached: 276 kB' 'Active: 8882444 kB' 'Inactive: 3144536 kB' 'Active(anon): 7974588 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 620432 kB' 'Mapped: 129892 kB' 'Shmem: 9660440 kB' 'KReclaimable: 596900 kB' 'Slab: 1602012 kB' 'SReclaimable: 596900 kB' 'SUnreclaim: 1005112 kB' 'KernelStack: 21872 kB' 'PageTables: 8432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12215936 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace elided: field-by-field compare/continue scan for HugePages_Total, MemTotal through Unaccepted]
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
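Those lookups feed the accounting check that follows at hugepages.sh@106-@109: the kernel's HugePages_Total must equal the requested pool size plus surplus and reserved pages (1024 == 1024 + 0 + 0 here). A sketch of the same arithmetic, reusing the hypothetical helper from the note above:

    # Recheck hugepage accounting the way the traced assertions do.
    nr_hugepages=1024 surp=0 resv=0
    total=$(get_meminfo_sketch HugePages_Total)      # 1024 on this machine
    (( total == nr_hugepages + surp + resv )) ||
        echo "hugepage accounting mismatch: total=$total" >&2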
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
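With the global pool verified, get_nodes enumerates the NUMA nodes under /sys/devices/system/node and records the expected even split, 512 of the 1024 pages for each of the two nodes; each node is then re-read through its own meminfo file (the HugePages_Surp lookup on node0 below). A sketch of that enumeration, assuming a Linux sysfs layout; variable names are illustrative, not the verbatim hugepages.sh:

    # Enumerate NUMA nodes and compute the even per-node hugepage share.
    shopt -s extglob nullglob
    nodes=(/sys/devices/system/node/node+([0-9]))
    no_nodes=${#nodes[@]}                    # 2 on this machine
    (( no_nodes > 0 )) || exit 1
    per_node=$(( 1024 / no_nodes ))          # 512 pages per node
    declare -a nodes_test
    for n in "${nodes[@]}"; do
        nodes_test[${n##*node}]=$per_node
    done
    # Per-node verification then reads the node's own meminfo, e.g.:
    # get_meminfo_sketch HugePages_Surp 0    # -> 0 for node0 below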
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:25.954 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23231096 kB' 'MemUsed: 9403340 kB' 'SwapCached: 176 kB' 'Active: 5476540 kB' 'Inactive: 535948 kB' 'Active(anon): 4698600 kB' 'Inactive(anon): 744 kB' 'Active(file): 777940 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5730620 kB' 'Mapped: 73208 kB' 'AnonPages: 285080 kB' 'Shmem: 4417300 kB' 'KernelStack: 11304 kB' 'PageTables: 4624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 410628 kB' 'Slab: 896088 kB' 'SReclaimable: 410628 kB' 'SUnreclaim: 485460 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: compare/continue scan of the node0 fields for HugePages_Surp, MemTotal through SUnreclaim]
00:04:25.955 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:25.955 15:41:33
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.955 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.955 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.955 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18235500 kB' 'MemUsed: 9413860 kB' 'SwapCached: 100 kB' 'Active: 3406344 kB' 'Inactive: 2608588 kB' 'Active(anon): 3276428 kB' 'Inactive(anon): 2302568 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5679200 kB' 'Mapped: 56684 kB' 'AnonPages: 335784 kB' 'Shmem: 5243164 kB' 'KernelStack: 10584 kB' 'PageTables: 3864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 186272 kB' 'Slab: 705924 kB' 'SReclaimable: 186272 kB' 'SUnreclaim: 519652 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.956 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 
15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- 
00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:04:25.957 node0=512 expecting 512
00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
00:04:25.957 node1=512 expecting 512
00:04:25.957 15:41:33 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]]
00:04:25.957
00:04:25.957 real 0m3.709s
00:04:25.957 user 0m1.412s
00:04:25.958 sys 0m2.366s
00:04:25.958 15:41:33 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:04:25.958 15:41:33 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:25.958 ************************************
00:04:25.958 END TEST even_2G_alloc
00:04:25.958 ************************************
00:04:25.958 15:41:33 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc
00:04:25.958 15:41:33 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:04:25.958 15:41:33 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:04:25.958 15:41:33 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:25.958 ************************************
00:04:25.958 START TEST odd_alloc
00:04:25.958 ************************************
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1129 -- # odd_alloc
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
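At this point get_test_nr_hugepages has turned the 2098176 kB request into nr_hugepages=1025, and get_test_nr_hugepages_per_node, traced next, spreads those 1025 pages across the 2 NUMA nodes. A sketch of that split, reconstructed from the trace that follows; the ": 513" and ": 1" lines there are bash `:` no-ops whose arithmetic arguments do the real work, and the standalone function name here is illustrative (the real helper operates on globals):

# Spread pages across nodes the way the trace shows: the node at the
# highest remaining index takes remaining/nodes_left, so 1025 pages
# over 2 nodes becomes node1=512, then node0=513.
split_hugepages_per_node() {
	local _nr_hugepages=$1 _no_nodes=$2
	local -a nodes_test=()
	while (( _no_nodes > 0 )); do
		nodes_test[_no_nodes - 1]=$((_nr_hugepages / _no_nodes))
		: $((_nr_hugepages -= nodes_test[_no_nodes - 1])) # traces as ": 513"
		: $((_no_nodes--))                                # traces as ": 1"
	done
	echo "${nodes_test[@]}"
}
# split_hugepages_per_node 1025 2  ->  513 512 (node0=513, node1=512)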
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:25.958 15:41:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:29.245 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:29.245 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
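Before the get_meminfo AnonHugePages trace below, note the gate just above: verify_nr_hugepages only samples anonymous hugepages when transparent hugepages are not disabled. The kernel reports the active THP mode in brackets, so "always [madvise] never" means madvise is active. A small sketch of the same check against the standard sysfs path (the surrounding logic is illustrative):

# The bracketed token is the active THP mode, e.g. "always [madvise] never".
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
if [[ $thp != *"[never]"* ]]; then
	# THP is not disabled, so AnonHugePages in meminfo may be non-zero
	grep AnonHugePages /proc/meminfo
fi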
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:29.245 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:29.510 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:29.510 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:29.510 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41490552 kB' 'MemAvailable: 43139732 kB' 'Buffers: 6816 kB' 'Cached: 11402812 kB' 'SwapCached: 276 kB' 'Active: 8884124 kB' 'Inactive: 3144536 kB' 'Active(anon): 7976268 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621640 kB' 'Mapped: 130020 kB' 'Shmem: 9660548 kB' 'KReclaimable: 596900 kB' 'Slab: 1602220 kB' 'SReclaimable: 596900 kB' 'SUnreclaim: 1005320 kB' 'KernelStack: 21824 kB' 'PageTables: 8396 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12216336 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:29.510 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:29.510 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
[... the same "setup/common.sh@32 -- # [[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]" / "continue" xtrace pair repeats for every meminfo field through HardwareCorrupted ...]
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41495416 kB' 'MemAvailable: 43144596 kB' 'Buffers: 6816 kB' 'Cached: 11402844 kB' 'SwapCached: 276 kB' 'Active: 8883236 kB' 'Inactive: 3144536 kB' 'Active(anon): 7975380 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621140 kB' 'Mapped: 129900 kB' 'Shmem: 9660580 kB' 'KReclaimable: 596900 kB' 'Slab: 1602192 kB' 'SReclaimable: 596900 kB' 'SUnreclaim: 1005292 kB' 'KernelStack: 21872 kB' 'PageTables: 8452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12216724 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:29.512 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
[... the same "setup/common.sh@32 -- # [[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]" / "continue" xtrace pair repeats field by field through SReclaimable ...]
00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.513 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 
15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.514 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- 
# [[ -n '' ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41495416 kB' 'MemAvailable: 43144596 kB' 'Buffers: 6816 kB' 'Cached: 11402848 kB' 'SwapCached: 276 kB' 'Active: 8883636 kB' 'Inactive: 3144536 kB' 'Active(anon): 7975780 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621604 kB' 'Mapped: 129900 kB' 'Shmem: 9660584 kB' 'KReclaimable: 596900 kB' 'Slab: 1602192 kB' 'SReclaimable: 596900 kB' 'SUnreclaim: 1005292 kB' 'KernelStack: 21888 kB' 'PageTables: 8480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12216744 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.515 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.516 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.516 15:41:37 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025 00:04:29.517 nr_hugepages=1025 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:04:29.517 resv_hugepages=0 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:04:29.517 surplus_hugepages=0 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:04:29.517 anon_hugepages=0 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:29.517 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages )) 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41496124 kB' 'MemAvailable: 43145304 kB' 'Buffers: 6816 kB' 'Cached: 11402884 kB' 'SwapCached: 276 kB' 'Active: 8883268 kB' 'Inactive: 3144536 kB' 'Active(anon): 7975412 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 621168 kB' 'Mapped: 129900 kB' 'Shmem: 9660620 kB' 'KReclaimable: 596900 kB' 'Slab: 1602192 kB' 'SReclaimable: 596900 kB' 'SUnreclaim: 1005292 kB' 'KernelStack: 21872 kB' 'PageTables: 8424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 12216768 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 
0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.518 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.519 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.519 15:41:37 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
[... xtrace elided: the same IFS=': ' / read -r / [[ key == HugePages_Total ]] / continue cycle repeats for every remaining meminfo key, SUnreclaim through Unaccepted ...]
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
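
An aside on what produces this wall of xtrace: setup/common.sh's get_meminfo reads a meminfo file with IFS=': ' and compares every key against the requested one, echoing the value on the first match, so each non-matching key costs one [[ ... ]] test plus one "continue" line in the trace. A minimal standalone sketch of the same lookup (hypothetical helper name; the real function in common.sh also handles the per-node files used below):

    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            # one skipped key == one "continue" line in the xtrace above
            [[ $var == "$get" ]] || continue
            echo "$val"   # e.g. 1025 for HugePages_Total on this box
            return 0
        done < /proc/meminfo
        return 1
    }
    get_meminfo_sketch HugePages_Total
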
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:29.520 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23232952 kB' 'MemUsed: 9401484 kB' 'SwapCached: 176 kB' 'Active: 5477064 kB' 'Inactive: 535948 kB' 'Active(anon): 4699124 kB' 'Inactive(anon): 744 kB' 'Active(file): 777940 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5730660 kB' 'Mapped: 73216 kB' 'AnonPages: 285592 kB' 'Shmem: 4417340 kB' 'KernelStack: 11288 kB' 'PageTables: 4572 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 410628 kB' 'Slab: 896200 kB' 'SReclaimable: 410628 kB' 'SUnreclaim: 485572 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[... xtrace elided: every node0 key from MemTotal through HugePages_Free is compared to HugePages_Surp and skipped with "continue" ...]
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
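
Note the node-scoped variant at work here: given a node argument, get_meminfo switches mem_f to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix that the mem=("${mem[@]#Node +([0-9]) }") substitution strips (an extglob pattern, visible in the trace above) before the same key scan runs. A hedged fragment showing just that strip against the real sysfs path:

    shopt -s extglob   # required for the +([0-9]) pattern
    mapfile -t mem < /sys/devices/system/node/node0/meminfo
    mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
    printf '%s\n' "${mem[@]}"
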
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:29.522 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18262920 kB' 'MemUsed: 9386440 kB' 'SwapCached: 100 kB' 'Active: 3407028 kB' 'Inactive: 2608588 kB' 'Active(anon): 3277112 kB' 'Inactive(anon): 2302568 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5679340 kB' 'Mapped: 56684 kB' 'AnonPages: 336384 kB' 'Shmem: 5243304 kB' 'KernelStack: 10616 kB' 'PageTables: 3960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 186272 kB' 'Slab: 705992 kB' 'SReclaimable: 186272 kB' 'SUnreclaim: 519720 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace elided: every node1 key from MemTotal through HugePages_Free is compared to HugePages_Surp and skipped with "continue" ...]
00:04:29.524 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:29.524 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:04:29.524 15:41:37 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:04:29.524 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:04:29.524 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:29.524 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:29.524 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:29.524 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513'
node0=513 expecting 513
00:04:29.524 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:29.524 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:29.525 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:29.525 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:04:29.525 15:41:37 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:04:29.525
00:04:29.525 real	0m3.594s
00:04:29.525 user	0m1.347s
00:04:29.525 sys	0m2.296s
00:04:29.525 15:41:37 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:04:29.525 15:41:37 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:29.525 ************************************
00:04:29.525 END TEST odd_alloc
00:04:29.525 ************************************
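
The point of odd_alloc is visible in the two "expecting" lines: 1025 hugepages is an odd count, so an even two-node split is impossible, and the test expects node0 to absorb the remainder (513) while node1 gets 512; the @129 test then matches the sorted expected set against the sorted observed set. The split arithmetic, sketched under the assumption that the remainder goes to the lowest-numbered node (which is what the observed 513/512 layout implies):

    total=1025 nodes=2
    echo "node0=$(( total / nodes + total % nodes ))"   # 513
    echo "node1=$(( total / nodes ))"                   # 512
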
00:04:29.525 15:41:37 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test custom_alloc custom_alloc
00:04:29.525 15:41:37 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:04:29.525 15:41:37 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:04:29.525 15:41:37 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:29.785 ************************************
00:04:29.785 START TEST custom_alloc
00:04:29.785 ************************************
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1129 -- # custom_alloc
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=,
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=()
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # local nodes_hp
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=512
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 256
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 1
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 0
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@166 -- # (( 2 > 1 ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 1 > 0 ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:04:29.785 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
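
The two get_test_nr_hugepages calls above turn a size into a page count: with the 2048 kB Hugepagesize reported in the meminfo snapshots, 1048576 kB (1 GiB) yields nr_hugepages=512 and 2097152 kB (2 GiB) yields 1024. The same arithmetic as a one-off check (assumes the sizes are in kB, which is what the 512/1024 results imply):

    hp_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)   # 2048 on this box
    echo $(( 1048576 / hp_kb ))                                # 512
    echo $(( 2097152 / hp_kb ))                                # 1024
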
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 2 > 0 ))
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=1024
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:29.786 15:41:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:33.078 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:33.078 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88 -- # local node
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local surp
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local resv
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local anon
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
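
Here setup.sh is invoked with HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024', and the /proc/meminfo snapshot that follows confirms the result: HugePages_Total: 1536 = 512 + 1024. (The @95 test just above appears to gate the AnonHugePages sample on transparent hugepages not being set to "[never]"; on this box the state is "always [madvise] never".) For reference, the same per-node split can be applied by hand through the kernel's standard per-node sysfs knobs; this is the generic kernel interface, not SPDK code:

    # needs root; 2048kB is the default hugepage size on this machine
    echo 512  > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
    echo 1024 > /sys/devices/system/node/node1/hugepages/hugepages-2048kB/nr_hugepages
    grep HugePages_Total /proc/meminfo   # HugePages_Total:    1536
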
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:33.078 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40437492 kB' 'MemAvailable: 42086640 kB' 'Buffers: 6816 kB' 'Cached: 11403000 kB' 'SwapCached: 276 kB' 'Active: 8886092 kB' 'Inactive: 3144536 kB' 'Active(anon): 7978236 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623880 kB' 'Mapped: 129900 kB' 'Shmem: 9660736 kB' 'KReclaimable: 596868 kB' 'Slab: 1602448 kB' 'SReclaimable: 596868 kB' 'SUnreclaim: 1005580 kB' 'KernelStack: 22112 kB' 'PageTables: 8972 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12218536 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[... xtrace elided: each key from MemTotal through VmallocChunk is compared to AnonHugePages and skipped with "continue"; the scan continues below ...]
00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40437064 kB' 'MemAvailable: 42086212 kB' 'Buffers: 6816 kB' 'Cached: 11403000 kB' 'SwapCached: 276 kB' 'Active: 8885812 kB' 'Inactive: 3144536 kB' 'Active(anon): 7977956 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623604 kB' 'Mapped: 129856 kB' 'Shmem: 9660736 kB' 'KReclaimable: 596868 kB' 'Slab: 1602428 kB' 'SReclaimable: 596868 kB' 'SUnreclaim: 1005560 kB' 'KernelStack: 22080 kB' 'PageTables: 9100 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12220052 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 
'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.080 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.081 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.082 15:41:40 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40438616 kB' 'MemAvailable: 42087764 kB' 'Buffers: 6816 kB' 'Cached: 11403020 kB' 'SwapCached: 276 kB' 'Active: 8885068 kB' 'Inactive: 3144536 kB' 'Active(anon): 7977212 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 622844 kB' 'Mapped: 129908 kB' 'Shmem: 9660756 kB' 'KReclaimable: 596868 kB' 'Slab: 1602496 kB' 'SReclaimable: 596868 kB' 'SUnreclaim: 1005628 kB' 'KernelStack: 21872 kB' 'PageTables: 8608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12218576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218148 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
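Every get_meminfo call traced here follows the same pattern: the meminfo source is read into an array, each "Key: value [kB]" record is split on IFS=': ', and every key that is not the requested one falls through to continue until the matching line's value is echoed. A minimal standalone sketch of that lookup, assuming plain /proc/meminfo input (an illustration of the pattern, not SPDK's exact setup/common.sh):

    #!/usr/bin/env bash
    # Split each "Key: value [kB]" line on ': ' and echo the value of the
    # requested key; non-matching keys fall through to continue, exactly as
    # in the [[ ... ]] / continue pairs traced above.
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # skip non-matching keys
            echo "$val"                        # unit suffix (kB) lands in _
            return 0
        done < /proc/meminfo
        return 1                               # key not present
    }

    get_meminfo_sketch HugePages_Surp          # prints 0 on the node traced here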
00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.082 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40438616 kB' 'MemAvailable: 42087764 kB' 'Buffers: 6816 kB' 'Cached: 11403020 kB' 'SwapCached: 276 kB' 'Active: 8885068 kB' 'Inactive: 3144536 kB' 'Active(anon): 7977212 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 622844 kB' 'Mapped: 129908 kB' 'Shmem: 9660756 kB' 'KReclaimable: 596868 kB' 'Slab: 1602496 kB' 'SReclaimable: 596868 kB' 'SUnreclaim: 1005628 kB' 'KernelStack: 21872 kB' 'PageTables: 8608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12218576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218148 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:33.082 15:41:40
[... setup/common.sh@31-32: scan for HugePages_Rsvd walks every key from MemTotal through HugePages_Free; each non-matching key takes the continue branch ...]
00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536 00:04:33.084 nr_hugepages=1536 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:04:33.084 resv_hugepages=0 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:04:33.084 surplus_hugepages=0 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:04:33.084 anon_hugepages=0 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages )) 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
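With anon, surp, and resv all resolved to 0, hugepages.sh@106-108 checks that the pool adds up before verifying the live HugePages_Total: the expected page count (1536 in this run) must equal nr_hugepages plus surplus and reserved pages. A small self-contained sketch of that arithmetic, with the values from this run hard-coded for illustration (not the exact setup/hugepages.sh):

    #!/usr/bin/env bash
    # Values as resolved by the trace above (illustrative, hard-coded).
    nr_hugepages=1536 surp=0 resv=0 anon=0

    # (( )) returns success exactly when the comparison holds, so the test
    # script can chain these checks and fail fast if the pool is short.
    if (( 1536 == nr_hugepages + surp + resv )) && (( 1536 == nr_hugepages )); then
        echo "hugepage pool consistent: ${nr_hugepages} pages (anon=${anon})"
    fi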
00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.084 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40438380 kB' 'MemAvailable: 42087528 kB' 'Buffers: 6816 kB' 'Cached: 11403040 kB' 'SwapCached: 276 kB' 'Active: 8885744 kB' 'Inactive: 3144536 kB' 'Active(anon): 7977888 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623532 kB' 'Mapped: 129908 kB' 'Shmem: 9660776 kB' 'KReclaimable: 596868 kB' 'Slab: 1602496 kB' 'SReclaimable: 596868 kB' 'SUnreclaim: 1005628 kB' 'KernelStack: 22048 kB' 'PageTables: 9008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 12220096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:33.085 15:41:40
[... setup/common.sh@31-32: scan for HugePages_Total steps through MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, each non-matching key taking the continue branch ...]
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.085 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 
15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
[[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var 
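The lookup traced above is what setup/common.sh's get_meminfo does: snapshot the relevant meminfo file into an array, strip any "Node N" prefix, then walk the key/value pairs until the requested field matches. A minimal stand-alone sketch of the same idea follows; the function name and the awk one-liner are illustrative, not the suite's code.

#!/usr/bin/env bash
# Sketch only: approximates get_meminfo's behavior with awk instead of the
# read/continue loop traced above. get_meminfo_sketch is our name, not SPDK's.
get_meminfo_sketch() {
  local key=$1 node=${2:-} src=/proc/meminfo
  # Per-node counters live in sysfs and carry a "Node N " prefix on each line.
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    src=/sys/devices/system/node/node$node/meminfo
  fi
  awk -v k="$key:" '{ sub(/^Node [0-9]+ /, "") }   # drop the per-node prefix
                    $1 == k { print $2; exit }' "$src"
}
get_meminfo_sketch HugePages_Total      # prints 1536 on this runner
get_meminfo_sketch HugePages_Surp 0    # surplus pages on NUMA node 0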
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:33.086 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23220948 kB' 'MemUsed: 9413488 kB' 'SwapCached: 176 kB' 'Active: 5476692 kB' 'Inactive: 535948 kB' 'Active(anon): 4698752 kB' 'Inactive(anon): 744 kB' 'Active(file): 777940 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5730768 kB' 'Mapped: 73224 kB' 'AnonPages: 285080 kB' 'Shmem: 4417448 kB' 'KernelStack: 11224 kB' 'PageTables: 4388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 410596 kB' 'Slab: 896612 kB' 'SReclaimable: 410596 kB' 'SUnreclaim: 486016 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:33.087 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [... per-key scan condensed: every node-0 field is compared against HugePages_Surp and skipped via continue ...]
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
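Both per-node probes read /sys/devices/system/node/nodeN/meminfo. The same counters are also exposed through the per-node hugepages sysfs directory; here is a small sketch of reading them that way (standard Linux sysfs paths, the 2048 kB default hugepage size seen above, and variable names of our choosing):

# Sketch: list per-node hugepage counts directly from sysfs rather than
# parsing the per-node meminfo files as the traced helper does.
for node_dir in /sys/devices/system/node/node[0-9]*; do
  node=${node_dir##*node}
  total=$(<"$node_dir/hugepages/hugepages-2048kB/nr_hugepages")
  free=$(<"$node_dir/hugepages/hugepages-2048kB/free_hugepages")
  surp=$(<"$node_dir/hugepages/hugepages-2048kB/surplus_hugepages")
  echo "node$node: total=$total free=$free surplus=$surp"
done
# On this runner the loop would report 512 pages on node0 and 1024 on node1.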
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 17218152 kB' 'MemUsed: 10431208 kB' 'SwapCached: 100 kB' 'Active: 3408888 kB' 'Inactive: 2608588 kB' 'Active(anon): 3278972 kB' 'Inactive(anon): 2302568 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5679396 kB' 'Mapped: 56684 kB' 'AnonPages: 338168 kB' 'Shmem: 5243360 kB' 'KernelStack: 10792 kB' 'PageTables: 4480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 186272 kB' 'Slab: 705884 kB' 'SReclaimable: 186272 kB' 'SUnreclaim: 519612 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:33.088 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:33.089 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [... per-key scan condensed: every node-1 field is compared against HugePages_Surp and skipped via continue ...]
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024'
node1=1024 expecting 1024
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:33.090
00:04:33.090 real 0m3.426s
00:04:33.090 user 0m1.263s
00:04:33.090 sys 0m2.154s
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:04:33.090 15:41:40 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:33.090 ************************************
00:04:33.090 END TEST custom_alloc
00:04:33.090 ************************************
00:04:33.090 15:41:40 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:33.090 15:41:40 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:04:33.090 15:41:40 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:04:33.090 15:41:40 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:33.090 ************************************
00:04:33.090 START TEST no_shrink_alloc
00:04:33.090 ************************************
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1129 -- # no_shrink_alloc
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0')
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:33.090 15:41:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
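NRHUGE and HUGENODE are the knobs setup.sh honors for the reservation here: 1024 pages, all pinned to node 0. Stripped of setup.sh's device binding and permission handling, the reservation step itself amounts to a single sysfs write; the sketch below is generic Linux (root required, default 2048 kB hugepage size), not the script's own code.

# Sketch: the per-node reservation implied by NRHUGE=1024 HUGENODE=0, done by
# hand. setup.sh additionally rebinds PCI devices to vfio-pci (see the driver
# messages that follow in the log).
NRHUGE=1024 HUGENODE=0
echo "$NRHUGE" > "/sys/devices/system/node/node${HUGENODE}/hugepages/hugepages-2048kB/nr_hugepages"
grep -H . "/sys/devices/system/node/node${HUGENODE}/hugepages/hugepages-2048kB/nr_hugepages"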
(8086 2021): Already using the vfio-pci driver 00:04:36.379 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:36.379 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:36.379 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:36.379 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:36.379 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:36.379 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:36.379 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:36.379 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41478828 kB' 'MemAvailable: 43127968 kB' 'Buffers: 6816 kB' 'Cached: 11403172 kB' 'SwapCached: 276 kB' 'Active: 8885752 kB' 'Inactive: 3144536 kB' 'Active(anon): 7977896 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623352 kB' 'Mapped: 130052 kB' 'Shmem: 9660908 kB' 'KReclaimable: 596860 kB' 'Slab: 1602388 kB' 'SReclaimable: 596860 kB' 'SUnreclaim: 1005528 kB' 'KernelStack: 21920 kB' 'PageTables: 8608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 
'CommitLimit: 37481924 kB' 'Committed_AS: 12218096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.645 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
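A note on the backslash-escaped strings running through this trace: under set -x, bash re-quotes the expanded right-hand side of a quoted [[ $var == "$get" ]] comparison character by character, which is why the literal string HugePages_Surp is rendered as \H\u\g\e\P\a\g\e\s\_\S\u\r\p. A minimal standalone reproduction (a hypothetical snippet, not part of setup/common.sh):

#!/usr/bin/env bash
# Reproduces the escaped-pattern rendering seen in the xtrace output above.
# Quoting the right-hand side forces a literal (non-glob) match, and xtrace
# prints that literal word with every character backslash-escaped.
get=HugePages_Surp
var=MemTotal
set -x
[[ $var == "$get" ]] || echo "no match"
set +x
# trace line produced: + [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]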
00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.646 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0 
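The scan that just returned is get_meminfo resolving AnonHugePages to 0 (the echo 0 / return 0 pair at setup/common.sh@33, consumed by hugepages.sh@96 as anon=0). Pieced together from the traced lines alone, the helper behaves roughly like the sketch below. Treat this as a reconstruction under assumptions, not the verbatim setup/common.sh: the per-node handling and the extglob strip of the "Node <N> " prefix are inferred from the @22, @23, @28, and @29 entries in the trace.

#!/usr/bin/env bash
# Sketch of get_meminfo as reconstructed from the xtrace above.
shopt -s extglob  # required for the +([0-9]) pattern below

get_meminfo() {
    local get=$1 node=$2   # e.g. get_meminfo AnonHugePages; node may be empty
    local var val
    local mem_f mem
    mem_f=/proc/meminfo
    # With a node argument, prefer the per-node meminfo when it exists
    # (trace @23 tests /sys/devices/system/node/node$node/meminfo; with an
    # empty node the path does not exist, so /proc/meminfo is used).
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node files prefix each line with "Node <N> "; strip it (trace @29).
    mem=("${mem[@]#Node +([0-9]) }")
    # Scan field by field; IFS=': ' splits "AnonHugePages: 0 kB" into
    # var=AnonHugePages, val=0, _=kB (trace @31/@32), and the first match
    # is echoed back to the caller (trace @33).
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && echo "$val" && return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

Against the meminfo dump printed at @16 above, this would yield 0 for AnonHugePages and 1024 for HugePages_Total, matching the NRHUGE=1024 requested for node 0 earlier in the no_shrink_alloc setup.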
00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41477516 kB' 'MemAvailable: 43126656 kB' 'Buffers: 6816 kB' 'Cached: 11403176 kB' 'SwapCached: 276 kB' 'Active: 8886104 kB' 'Inactive: 3144536 kB' 'Active(anon): 7978248 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623716 kB' 'Mapped: 130052 kB' 'Shmem: 9660912 kB' 'KReclaimable: 596860 kB' 'Slab: 1602368 kB' 'SReclaimable: 596860 kB' 'SUnreclaim: 1005508 kB' 'KernelStack: 21904 kB' 'PageTables: 8552 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12218112 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.647 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.648 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@18 -- # local node= 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41479204 kB' 'MemAvailable: 43128344 kB' 'Buffers: 6816 kB' 'Cached: 11403196 kB' 'SwapCached: 276 kB' 'Active: 8885844 kB' 'Inactive: 3144536 kB' 'Active(anon): 7977988 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623384 kB' 'Mapped: 129928 kB' 'Shmem: 9660932 kB' 'KReclaimable: 596860 kB' 'Slab: 1602292 kB' 'SReclaimable: 596860 kB' 'SUnreclaim: 1005432 kB' 'KernelStack: 21872 kB' 'PageTables: 8428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12218136 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.649 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.650 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:04:36.651 nr_hugepages=1024 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:04:36.651 resv_hugepages=0 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:04:36.651 surplus_hugepages=0 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:04:36.651 anon_hugepages=0 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.651 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41479204 kB' 'MemAvailable: 43128344 kB' 'Buffers: 6816 kB' 'Cached: 11403196 kB' 'SwapCached: 276 kB' 'Active: 8885844 kB' 'Inactive: 3144536 kB' 'Active(anon): 7977988 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623384 kB' 'Mapped: 129928 kB' 'Shmem: 9660932 kB' 'KReclaimable: 596860 kB' 'Slab: 1602292 kB' 'SReclaimable: 596860 kB' 'SUnreclaim: 1005432 kB' 'KernelStack: 21872 kB' 'PageTables: 8428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12218156 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 
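[Editor note] The loop traced above is setup/common.sh's get_meminfo walk: the script snapshots meminfo with printf/mapfile, then steps through it with IFS=': ' read -r var val _, hitting 'continue' on every key until the requested one matches and its value is echoed. A minimal standalone sketch of the same lookup (the function name get_meminfo_sketch is illustrative, not the script's real helper):

#!/usr/bin/env bash
# Minimal sketch of the lookup pattern traced above: split each
# /proc/meminfo line on ': ' and print the value of one requested key.
get_meminfo_sketch() {            # illustrative name, not the real helper
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # every non-match hits 'continue'
        echo "$val"                        # e.g. 0 for HugePages_Rsvd
        return 0
    done < /proc/meminfo
    return 1
}

get_meminfo_sketch HugePages_Rsvd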
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.652 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 
15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:36.653 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:36.654 15:41:44 
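[Editor note] The get_nodes step traced above globs /sys/devices/system/node/node+([0-9]) and records a per-node hugepage map (node0=1024, node1=0 on this host, hence no_nodes=2). A sketch of the same enumeration, assuming the standard hugepages-2048kB sysfs counter rather than the script's per-node meminfo read:

#!/usr/bin/env bash
# Sketch of the get_nodes step: enumerate NUMA nodes via sysfs and record
# each node's allocated 2 MiB hugepages (assumes the hugepages-2048kB dir).
shopt -s extglob
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
done
echo "no_nodes=${#nodes_sys[@]}"                      # 2 on this test host
for n in "${!nodes_sys[@]}"; do echo "node$n=${nodes_sys[$n]}"; done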
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22162656 kB' 'MemUsed: 10471780 kB' 'SwapCached: 176 kB' 'Active: 5475720 kB' 'Inactive: 535948 kB' 'Active(anon): 4697780 kB' 'Inactive(anon): 744 kB' 'Active(file): 777940 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5730904 kB' 'Mapped: 73244 kB' 'AnonPages: 283876 kB' 'Shmem: 4417584 kB' 'KernelStack: 11240 kB' 'PageTables: 4428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 410588 kB' 'Slab: 895776 kB' 'SReclaimable: 410588 kB' 'SUnreclaim: 485188 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc 
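[Editor note] Worth noting in the per-node pass above: once node=0 is set, common.sh@22-@24 switch the source from /proc/meminfo to /sys/devices/system/node/node0/meminfo, and @29 strips the 'Node 0 ' prefix so the same key/value parser handles both files. A condensed sketch of that selection:

#!/usr/bin/env bash
# Sketch of the node-aware source selection at common.sh@22-@29: read the
# per-node meminfo when a node is given, then drop the 'Node N ' prefix so
# the key/value parser sketched earlier works unchanged.
shopt -s extglob
node=0
mem_f=/proc/meminfo
[[ -e /sys/devices/system/node/node$node/meminfo ]] &&
    mem_f=/sys/devices/system/node/node$node/meminfo
mapfile -t mem < "$mem_f"
mem=("${mem[@]#Node +([0-9]) }")   # 'Node 0 MemTotal: ...' -> 'MemTotal: ...'
printf '%s\n' "${mem[@]:0:3}"      # first few normalized lines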
-- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.654 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 
15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:04:36.655 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:04:36.655 15:41:44 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:04:36.656 node0=1024 expecting 1024 00:04:36.656 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:04:36.656 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no 00:04:36.656 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512 00:04:36.656 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0 00:04:36.656 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output 00:04:36.656 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:36.656 15:41:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:39.945 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:39.945 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:39.945 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:39.945 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:39.945 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:39.946 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:39.946 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- 
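[Editor note] The INFO line above is the point of this test: setup.sh was re-run with CLEAR_HUGE=no NRHUGE=512 HUGENODE=0, asking for fewer pages than the 1024 already allocated on node0, and the allocation must not shrink. A sketch of the invariant being exercised, assuming the same sysfs counter as in the earlier sketches:

#!/usr/bin/env bash
# Sketch of the no-shrink check this test exercises: requesting fewer pages
# with CLEAR_HUGE=no must leave the existing node0 allocation intact.
counter=/sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
before=$(< "$counter")
CLEAR_HUGE=no NRHUGE=512 HUGENODE=0 ./spdk/scripts/setup.sh   # full path as in the trace
after=$(< "$counter")
(( after >= before )) || { echo "hugepages shrank: $before -> $after" >&2; exit 1; }
echo "node0 kept $after hugepages (had $before)"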
setup/common.sh@20 -- # local mem_f mem
00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:39.946 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41491120 kB' 'MemAvailable: 43140260 kB' 'Buffers: 6816 kB' 'Cached: 11403324 kB' 'SwapCached: 276 kB' 'Active: 8886860 kB' 'Inactive: 3144536 kB' 'Active(anon): 7979004 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 624252 kB' 'Mapped: 129972 kB' 'Shmem: 9661060 kB' 'KReclaimable: 596860 kB' 'Slab: 1602296 kB' 'SReclaimable: 596860 kB' 'SUnreclaim: 1005436 kB' 'KernelStack: 21936 kB' 'PageTables: 8624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12218756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[repetitive per-key xtrace: every field from MemTotal through HardwareCorrupted fails the setup/common.sh@32 match against AnonHugePages and hits continue]
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:39.948 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41499276 kB' 'MemAvailable: 43148408 kB' 'Buffers: 6816 kB' 'Cached: 11403328 kB' 'SwapCached: 276 kB' 'Active: 8886836 kB' 'Inactive: 3144536 kB' 'Active(anon): 7978980 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 624368 kB' 'Mapped: 129936 kB' 'Shmem: 9661064 kB' 'KReclaimable: 596852 kB' 'Slab: 1602264 kB' 'SReclaimable: 596852 kB' 'SUnreclaim: 1005412 kB' 'KernelStack: 21936 kB' 'PageTables: 8632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12218776 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
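The xtrace above is setup/common.sh's get_meminfo helper scanning /proc/meminfo one field at a time: it snapshots the file with mapfile, feeds the snapshot back through printf, and read/continue over every key until the requested one matches. A minimal sketch of that loop, reconstructed from the trace (the per-node branching at common.sh@23/@25 is simplified here, so treat this as an illustration rather than the verbatim SPDK source):

  shopt -s extglob   # needed for the +([0-9]) pattern below

  get_meminfo() {    # usage: get_meminfo <field> [<numa node>]
      local get=$1 node=$2
      local var val
      local mem_f mem
      mem_f=/proc/meminfo
      # Assumption: when a node is given, the per-node file is preferred,
      # mirroring the @23/@25 checks in the trace (both fail with node='').
      [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
          mem_f=/sys/devices/system/node/node$node/meminfo
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")       # per-node lines carry a "Node <N> " prefix
      while IFS=': ' read -r var val _; do   # e.g. var=AnonHugePages val=0 _=kB
          [[ $var == "$get" ]] || continue   # each mismatch is one 'continue' in the trace
          echo "$val"                        # bare number; the unit column lands in $_
          return 0
      done < <(printf '%s\n' "${mem[@]}")    # the printf whose arguments fill the log above
      return 1
  }

  get_meminfo AnonHugePages   # prints 0 on this box, hence 'anon=0' above

Returning the bare number lets hugepages.sh do integer arithmetic on the result directly, which is what the (( ... )) checks later in this trace rely on.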
[repetitive per-key xtrace: every field from MemTotal through HugePages_Rsvd fails the setup/common.sh@32 match against HugePages_Surp and hits continue]
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:39.951 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41500312 kB' 'MemAvailable: 43149444 kB' 'Buffers: 6816 kB' 'Cached: 11403344 kB' 'SwapCached: 276 kB' 'Active: 8886444 kB' 'Inactive: 3144536 kB' 'Active(anon): 7978588 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623900 kB' 'Mapped: 129936 kB' 'Shmem: 9661080 kB' 'KReclaimable: 596852 kB' 'Slab: 1602240 kB' 'SReclaimable: 596852 kB' 'SUnreclaim: 1005388 kB' 'KernelStack: 21904 kB' 'PageTables: 8520 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12218796 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[repetitive per-key xtrace: every field from MemTotal through HugePages_Free fails the setup/common.sh@32 match against HugePages_Rsvd and hits continue]
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:04:40.215 nr_hugepages=1024
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:04:40.215 resv_hugepages=0
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:04:40.215 surplus_hugepages=0
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:04:40.215 anon_hugepages=0
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41500388 kB' 'MemAvailable: 43149520 kB' 'Buffers: 6816 kB' 'Cached: 11403344 kB' 'SwapCached: 276 kB' 'Active: 8886980 kB' 'Inactive: 3144536 kB' 'Active(anon): 7979124 kB' 'Inactive(anon): 2303312 kB' 'Active(file): 907856 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7699452 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 623932 kB' 'Mapped: 129936 kB' 'Shmem: 9661080 kB' 'KReclaimable: 596852 kB' 'Slab: 1602240 kB' 'SReclaimable: 596852 kB' 'SUnreclaim: 1005388 kB' 'KernelStack: 21920 kB' 'PageTables: 8572 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 12218820 kB' 'VmallocTotal: 
34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 122752 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.215 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 
15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.216 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:40.217 
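For reference, the lookup the xtrace above keeps stepping through is a plain read loop over /proc/meminfo with `IFS=': '`, skipping every field until the requested one and echoing its value. A minimal standalone sketch of that pattern, with illustrative names (not the SPDK helpers themselves):

```bash
#!/usr/bin/env bash
# Sketch of the meminfo lookup traced above: walk /proc/meminfo field by
# field and print the value of the requested key.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # e.g. "HugePages_Total: 1024" -> var=HugePages_Total val=1024
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < /proc/meminfo
    return 1
}

nr=$(get_meminfo HugePages_Total)
resv=$(get_meminfo HugePages_Rsvd)
surp=$(get_meminfo HugePages_Surp)
# The test's accounting identity, as in hugepages.sh@109; it only holds on a
# host configured for 1024 hugepages.
(( 1024 == nr + surp + resv )) && echo "pool intact: $nr total, $resv reserved, $surp surplus"
```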
15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22179392 kB' 'MemUsed: 10455044 kB' 'SwapCached: 176 kB' 'Active: 5476476 kB' 'Inactive: 535948 kB' 'Active(anon): 4698536 kB' 'Inactive(anon): 744 kB' 'Active(file): 777940 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5730948 kB' 'Mapped: 73252 kB' 'AnonPages: 284700 kB' 'Shmem: 4417628 kB' 'KernelStack: 11256 kB' 'PageTables: 4484 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 410580 kB' 'Slab: 895664 kB' 'SReclaimable: 410580 kB' 'SUnreclaim: 485084 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:40.217 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 
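The lookup in progress here is the per-node variant: with `node=0`, the script reads /sys/devices/system/node/node0/meminfo instead of /proc/meminfo and strips the `Node 0 ` prefix each line carries there (the `mem=("${mem[@]#Node +([0-9]) }")` expansion in the trace). A hedged sketch, with illustrative names:

```bash
#!/usr/bin/env bash
shopt -s extglob                                    # needed for +([0-9])
# Sketch of the per-node meminfo read traced above.
node_meminfo() {
    local node=$1 mem_f=/proc/meminfo
    # Prefer the per-node view when it exists, as the traced branch does.
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")                # drop the "Node 0 " prefix
    printf '%s\n' "${mem[@]}"
}
node_meminfo 0 | grep '^HugePages'
```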
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.218 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:04:40.219 node0=1024 expecting 1024 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:04:40.219 00:04:40.219 real 0m6.996s 00:04:40.219 user 0m2.524s 00:04:40.219 sys 0m4.594s 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:40.219 15:41:47 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:40.219 ************************************ 00:04:40.219 END TEST no_shrink_alloc 00:04:40.219 ************************************ 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:40.219 15:41:48 setup.sh.hugepages -- 
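The clear_hp step that runs next in the trace writes 0 into every hugepage-size pool on every NUMA node so later tests start from a clean slate; `${node##*node}` is how the script turns a sysfs path into a node index. A minimal sketch of the same loop (sudo/tee usage is an assumption about privileges, not taken from the script):

```bash
#!/usr/bin/env bash
shopt -s extglob                        # for the +([0-9]) glob used in the trace
# Sketch of clear_hp: zero every per-node hugepage pool.
for node in /sys/devices/system/node/node+([0-9]); do
    idx=${node##*node}                  # ".../node1" -> "1"
    for hp in "$node"/hugepages/hugepages-*; do
        echo 0 | sudo tee "$hp/nr_hugepages" >/dev/null
    done
    echo "node$idx: hugepage pools cleared"
done
```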
setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:04:40.219 15:41:48 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:04:40.219 00:04:40.219 real 0m23.647s 00:04:40.219 user 0m8.249s 00:04:40.219 sys 0m14.188s 00:04:40.219 15:41:48 setup.sh.hugepages -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:40.219 15:41:48 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:40.219 ************************************ 00:04:40.219 END TEST hugepages 00:04:40.219 ************************************ 00:04:40.219 15:41:48 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:40.219 15:41:48 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:40.219 15:41:48 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:40.219 15:41:48 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:40.219 ************************************ 00:04:40.219 START TEST driver 00:04:40.219 ************************************ 00:04:40.219 15:41:48 setup.sh.driver -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:40.478 * Looking for test storage... 00:04:40.478 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:40.478 15:41:48 setup.sh.driver -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:40.478 15:41:48 setup.sh.driver -- common/autotest_common.sh@1693 -- # lcov --version 00:04:40.478 15:41:48 setup.sh.driver -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:40.478 15:41:48 setup.sh.driver -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:40.478 15:41:48 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:40.479 15:41:48 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:04:40.479 15:41:48 setup.sh.driver -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:40.479 15:41:48 setup.sh.driver -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:40.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:40.479 --rc genhtml_branch_coverage=1 00:04:40.479 --rc genhtml_function_coverage=1 00:04:40.479 --rc genhtml_legend=1 00:04:40.479 --rc geninfo_all_blocks=1 00:04:40.479 --rc geninfo_unexecuted_blocks=1 00:04:40.479 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:40.479 ' 00:04:40.479 15:41:48 setup.sh.driver -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:40.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:40.479 --rc genhtml_branch_coverage=1 00:04:40.479 --rc genhtml_function_coverage=1 00:04:40.479 --rc genhtml_legend=1 00:04:40.479 --rc geninfo_all_blocks=1 00:04:40.479 --rc geninfo_unexecuted_blocks=1 00:04:40.479 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:40.479 ' 00:04:40.479 15:41:48 setup.sh.driver -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:40.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:40.479 --rc genhtml_branch_coverage=1 00:04:40.479 --rc genhtml_function_coverage=1 00:04:40.479 --rc genhtml_legend=1 00:04:40.479 --rc geninfo_all_blocks=1 00:04:40.479 --rc geninfo_unexecuted_blocks=1 00:04:40.479 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:40.479 ' 00:04:40.479 15:41:48 setup.sh.driver -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:40.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:40.479 --rc genhtml_branch_coverage=1 00:04:40.479 --rc genhtml_function_coverage=1 00:04:40.479 --rc genhtml_legend=1 00:04:40.479 --rc geninfo_all_blocks=1 00:04:40.479 --rc geninfo_unexecuted_blocks=1 00:04:40.479 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:40.479 ' 00:04:40.479 15:41:48 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:40.479 15:41:48 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:40.479 15:41:48 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:44.675 15:41:52 setup.sh.driver -- 
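The lcov gate above is a dotted-version comparison (the lt/cmp_versions path in scripts/common.sh): split both versions on dots, compare component-wise, and treat missing components as zero. A compact sketch of that logic, with an illustrative function name:

```bash
#!/usr/bin/env bash
# Sketch of the dotted-version test traced above: is $1 < $2?
version_lt() {
    local IFS=.
    local -a v1=($1) v2=($2)            # "1.15" splits into 1 and 15
    local i len=$(( ${#v1[@]} > ${#v2[@]} ? ${#v1[@]} : ${#v2[@]} ))
    for (( i = 0; i < len; i++ )); do
        (( ${v1[i]:-0} < ${v2[i]:-0} )) && return 0
        (( ${v1[i]:-0} > ${v2[i]:-0} )) && return 1
    done
    return 1                            # equal versions are not "less than"
}
version_lt 1.15 2 && echo "lcov 1.15 predates 2.x"   # mirrors the lt 1.15 2 above
```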
setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:44.675 15:41:52 setup.sh.driver -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:44.675 15:41:52 setup.sh.driver -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:44.675 15:41:52 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:44.675 ************************************ 00:04:44.675 START TEST guess_driver 00:04:44.675 ************************************ 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- common/autotest_common.sh@1129 -- # guess_driver 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:44.675 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:44.675 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:44.675 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:44.675 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:44.675 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:44.675 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:44.675 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:44.675 Looking for driver=vfio-pci 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:44.675 15:41:52 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- 
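The decision guess_driver just made can be summarized as: pick vfio-pci when IOMMU groups exist (176 on this host) or unsafe no-IOMMU mode is enabled, and `modprobe --show-depends vfio_pci` resolves to real `.ko` modules. A simplified sketch of that flow, not the setup/driver.sh script itself:

```bash
#!/usr/bin/env bash
shopt -s nullglob                       # empty iommu_groups dir -> 0 entries
# Sketch of the vfio-pci eligibility check traced above.
pick_driver() {
    local unsafe=N
    local -a groups=(/sys/kernel/iommu_groups/*)
    [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] &&
        unsafe=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
    if (( ${#groups[@]} > 0 )) || [[ $unsafe == Y ]]; then
        # --show-depends prints "insmod ....ko.xz" lines when the module and
        # its dependencies resolve, as seen in the trace.
        modprobe --show-depends vfio_pci &>/dev/null && { echo vfio-pci; return 0; }
    fi
    echo 'No valid driver found'
    return 1
}
pick_driver
```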
# setup output config 00:04:44.676 15:41:52 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:44.676 15:41:52 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:47.965 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:47.965 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:47.965 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:47.965 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:47.965 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:47.965 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:47.965 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:47.965 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:47.965 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:55 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:56 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:48.224 15:41:56 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.605 15:41:57 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:49.605 15:41:57 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:49.605 15:41:57 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:49.864 15:41:57 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:49.864 15:41:57 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:49.864 15:41:57 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:49.864 15:41:57 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:55.142 00:04:55.142 real 0m9.864s 00:04:55.142 user 0m2.549s 00:04:55.142 sys 0m5.140s 00:04:55.142 15:42:02 setup.sh.driver.guess_driver -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:55.142 15:42:02 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:55.142 ************************************ 00:04:55.142 END TEST guess_driver 00:04:55.142 ************************************ 00:04:55.142 00:04:55.142 real 0m14.396s 00:04:55.142 user 0m3.715s 00:04:55.142 sys 0m7.658s 00:04:55.142 15:42:02 setup.sh.driver -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:55.142 15:42:02 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:55.142 ************************************ 00:04:55.142 END TEST driver 00:04:55.142 ************************************ 00:04:55.142 15:42:02 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:55.142 15:42:02 setup.sh -- 
common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:55.142 15:42:02 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:55.142 15:42:02 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:55.142 ************************************ 00:04:55.142 START TEST devices 00:04:55.142 ************************************ 00:04:55.142 15:42:02 setup.sh.devices -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:55.142 * Looking for test storage... 00:04:55.142 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:55.142 15:42:02 setup.sh.devices -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:55.142 15:42:02 setup.sh.devices -- common/autotest_common.sh@1693 -- # lcov --version 00:04:55.142 15:42:02 setup.sh.devices -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:55.142 15:42:02 setup.sh.devices -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@336 -- # IFS=.-: 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@336 -- # read -ra ver1 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@337 -- # IFS=.-: 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@337 -- # read -ra ver2 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@338 -- # local 'op=<' 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@340 -- # ver1_l=2 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@341 -- # ver2_l=1 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@344 -- # case "$op" in 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@345 -- # : 1 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@365 -- # decimal 1 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@353 -- # local d=1 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@355 -- # echo 1 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@365 -- # ver1[v]=1 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@366 -- # decimal 2 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@353 -- # local d=2 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@355 -- # echo 2 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@366 -- # ver2[v]=2 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:55.142 15:42:02 setup.sh.devices -- scripts/common.sh@368 -- # return 0 00:04:55.142 15:42:02 setup.sh.devices -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:55.142 15:42:02 setup.sh.devices -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:55.142 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:55.142 --rc genhtml_branch_coverage=1 00:04:55.142 --rc genhtml_function_coverage=1 00:04:55.142 --rc genhtml_legend=1 00:04:55.142 --rc geninfo_all_blocks=1 00:04:55.142 --rc geninfo_unexecuted_blocks=1 00:04:55.142 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:55.142 ' 00:04:55.142 15:42:02 setup.sh.devices -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:55.142 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:55.142 --rc genhtml_branch_coverage=1 00:04:55.142 --rc genhtml_function_coverage=1 00:04:55.142 --rc genhtml_legend=1 00:04:55.142 --rc geninfo_all_blocks=1 00:04:55.142 --rc geninfo_unexecuted_blocks=1 00:04:55.142 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:55.142 ' 00:04:55.142 15:42:02 setup.sh.devices -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:55.142 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:55.142 --rc genhtml_branch_coverage=1 00:04:55.142 --rc genhtml_function_coverage=1 00:04:55.142 --rc genhtml_legend=1 00:04:55.142 --rc geninfo_all_blocks=1 00:04:55.142 --rc geninfo_unexecuted_blocks=1 00:04:55.142 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:55.142 ' 00:04:55.142 15:42:02 setup.sh.devices -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:55.142 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:55.142 --rc genhtml_branch_coverage=1 00:04:55.142 --rc genhtml_function_coverage=1 00:04:55.142 --rc genhtml_legend=1 00:04:55.142 --rc geninfo_all_blocks=1 00:04:55.142 --rc geninfo_unexecuted_blocks=1 00:04:55.142 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:55.142 ' 00:04:55.142 15:42:02 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:55.142 15:42:02 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:55.142 15:42:02 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:55.142 15:42:02 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:59.339 15:42:06 setup.sh.devices -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:59.339 15:42:06 setup.sh.devices -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:59.339 15:42:06 setup.sh.devices -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:59.339 15:42:06 setup.sh.devices -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:59.339 15:42:06 setup.sh.devices -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:59.339 15:42:06 setup.sh.devices -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:59.339 15:42:06 setup.sh.devices -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:59.339 15:42:06 setup.sh.devices -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:59.339 15:42:06 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:04:59.339 15:42:06 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:59.339 No valid GPT data, bailing 00:04:59.339 15:42:06 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:59.339 15:42:06 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:04:59.339 15:42:06 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:59.339 15:42:06 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:59.339 15:42:06 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:59.339 15:42:06 setup.sh.devices -- setup/common.sh@80 -- # echo 1600321314816 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:59.339 15:42:06 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:59.339 15:42:06 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:59.339 15:42:06 
setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:59.339 15:42:06 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:59.339 ************************************ 00:04:59.339 START TEST nvme_mount 00:04:59.339 ************************************ 00:04:59.339 15:42:06 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1129 -- # nvme_mount 00:04:59.339 15:42:06 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:59.339 15:42:06 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:59.339 15:42:06 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:59.339 15:42:06 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:59.339 15:42:06 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:59.340 15:42:06 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:59.909 Creating new GPT entries in memory. 00:04:59.909 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:59.909 other utilities. 00:04:59.909 15:42:07 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:59.909 15:42:07 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:59.909 15:42:07 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:59.909 15:42:07 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:59.909 15:42:07 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:01.080 Creating new GPT entries in memory. 00:05:01.080 The operation has completed successfully. 
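The nvme_mount steps traced here — zap the GPT, carve out a 1 GiB partition, wait for the partition uevent, format it ext4, mount it, drop a dummy test file, and finally tear everything down with wipefs — can be reproduced by hand with the same tools the test drives. A minimal sketch follows, assuming a disposable scratch disk at /dev/nvme0n1 and a hypothetical mount point /mnt/spdk_test (neither taken verbatim from the test: the real script mounts under test/setup/nvme_mount and synchronizes on uevents via sync_dev_uevents.sh, for which udevadm settle stands in here):

#!/usr/bin/env bash
# DESTRUCTIVE: wipes /dev/nvme0n1. Use only a scratch disk.
set -e
disk=/dev/nvme0n1
mnt=/mnt/spdk_test                    # hypothetical mount point, not from the log

sgdisk "$disk" --zap-all              # destroy existing GPT/MBR structures
sgdisk "$disk" --new=1:2048:2099199   # 1 GiB partition (2097152 x 512 B sectors)
udevadm settle                        # wait for /dev/nvme0n1p1 to appear

mkfs.ext4 -qF "${disk}p1"             # quiet + force, same flags as the trace
mkdir -p "$mnt"
mount "${disk}p1" "$mnt"
touch "$mnt/test_nvme"                # dummy file the test later verifies

# Teardown, mirroring cleanup_nvme:
rm "$mnt/test_nvme"
umount "$mnt"
wipefs --all "${disk}p1"
wipefs --all "$disk"

The wipefs output in the surrounding entries (ext4 magic 53 ef at offset 0x438, GPT signatures at the first and last LBAs, the PMBR 55 aa) is what such a teardown prints when it succeeds.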
00:05:01.080 15:42:08 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1678680 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:01.081 15:42:08 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.372 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.373 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.373 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.373 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:04.373 15:42:11 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:04.373 15:42:11 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:04.373 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:04.373 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:04.632 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:04.632 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:04.632 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:04.632 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:04.632 15:42:12 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # 
local pci status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:07.923 15:42:15 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:11.220 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:11.220 00:05:11.220 real 0m12.155s 00:05:11.220 user 0m3.403s 00:05:11.220 sys 0m6.561s 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:11.220 15:42:18 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:11.220 ************************************ 00:05:11.220 END TEST nvme_mount 00:05:11.220 ************************************ 00:05:11.220 15:42:18 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:11.220 15:42:18 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:11.220 15:42:18 setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:11.220 15:42:18 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:11.220 ************************************ 00:05:11.220 START TEST dm_mount 00:05:11.220 ************************************ 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- common/autotest_common.sh@1129 -- # dm_mount 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # 
local disk=nvme0n1 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:11.220 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:11.221 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:11.221 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:11.221 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:11.221 15:42:18 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:12.159 Creating new GPT entries in memory. 00:05:12.159 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:12.159 other utilities. 00:05:12.159 15:42:19 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:12.159 15:42:19 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:12.159 15:42:19 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:12.159 15:42:19 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:12.159 15:42:19 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:13.096 Creating new GPT entries in memory. 00:05:13.096 The operation has completed successfully. 00:05:13.096 15:42:20 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:13.096 15:42:20 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:13.096 15:42:20 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:13.096 15:42:20 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:13.096 15:42:20 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:14.087 The operation has completed successfully. 
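With both 1 GiB partitions in place, the dm_mount entries that follow join nvme0n1p1 and nvme0n1p2 into a single linear device-mapper target (nvme_dm_test) and format that instead of a raw partition — which is why the config scan below reports holder@nvme0n1p1:dm-0 and holder@nvme0n1p2:dm-0. A minimal sketch of that mapping, assuming the two partitions created by the sgdisk calls above (the real script retries dmsetup create and syncs on uevents, which this skips):

#!/usr/bin/env bash
set -e
p1=/dev/nvme0n1p1
p2=/dev/nvme0n1p2
s1=$(blockdev --getsz "$p1")          # partition sizes in 512-byte sectors
s2=$(blockdev --getsz "$p2")

# dm table rows: <start> <length> linear <device> <offset>.
# p2 is appended directly after p1, giving one 2 GiB device.
dmsetup create nvme_dm_test <<TABLE
0 $s1 linear $p1 0
$s1 $s2 linear $p2 0
TABLE

mkfs.ext4 -qF /dev/mapper/nvme_dm_test   # format the concatenated device

# Teardown, mirroring cleanup_dm:
dmsetup remove --force nvme_dm_test
wipefs --all "$p1" "$p2"

Because the mapped device, not the partitions, carries the filesystem, the later cleanup wipes only the ext4 signature from nvme0n1p1 before re-zapping the whole disk.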
00:05:14.087 15:42:22 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:14.087 15:42:22 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:14.087 15:42:22 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1683106 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.346 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:14.347 15:42:22 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:14.347 15:42:22 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:14.347 15:42:22 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:17.642 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:17.643 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:17.643 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:17.904 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:17.904 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:17.904 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:17.904 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:17.904 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:17.904 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.904 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:17.904 15:42:25 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:17.904 15:42:25 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:17.904 15:42:25 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:20.442 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.443 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:20.702 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:20.702 15:42:28 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:20.961 00:05:20.961 real 0m9.725s 00:05:20.961 user 0m2.160s 00:05:20.961 sys 0m4.598s 00:05:20.961 15:42:28 setup.sh.devices.dm_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:20.961 15:42:28 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:20.961 ************************************ 00:05:20.961 END TEST dm_mount 00:05:20.961 ************************************ 00:05:20.961 15:42:28 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:20.961 15:42:28 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:20.961 15:42:28 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:20.961 15:42:28 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:20.961 15:42:28 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:20.961 15:42:28 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:20.961 15:42:28 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:21.220 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:21.220 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:21.220 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:21.220 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:21.220 15:42:29 setup.sh.devices -- 
setup/devices.sh@12 -- # cleanup_dm 00:05:21.220 15:42:29 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:21.220 15:42:29 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:21.220 15:42:29 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:21.220 15:42:29 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:21.220 15:42:29 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:21.220 15:42:29 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:21.220 00:05:21.220 real 0m26.435s 00:05:21.220 user 0m7.062s 00:05:21.220 sys 0m14.134s 00:05:21.220 15:42:29 setup.sh.devices -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:21.220 15:42:29 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:21.220 ************************************ 00:05:21.220 END TEST devices 00:05:21.220 ************************************ 00:05:21.220 00:05:21.220 real 1m28.888s 00:05:21.220 user 0m26.456s 00:05:21.220 sys 0m51.024s 00:05:21.220 15:42:29 setup.sh -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:21.220 15:42:29 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:21.220 ************************************ 00:05:21.220 END TEST setup.sh 00:05:21.220 ************************************ 00:05:21.220 15:42:29 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:24.512 Hugepages 00:05:24.512 node hugesize free / total 00:05:24.512 node0 1048576kB 0 / 0 00:05:24.512 node0 2048kB 1024 / 1024 00:05:24.512 node1 1048576kB 0 / 0 00:05:24.512 node1 2048kB 1024 / 1024 00:05:24.512 00:05:24.512 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:24.512 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:24.512 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:24.512 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:24.512 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:24.771 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:24.771 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:24.771 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:24.771 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:24.771 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:24.771 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:24.771 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:24.771 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:24.771 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:24.771 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:24.771 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:24.771 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:24.771 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:24.771 15:42:32 -- spdk/autotest.sh@117 -- # uname -s 00:05:24.771 15:42:32 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:05:24.771 15:42:32 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:05:24.771 15:42:32 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:28.062 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:28.062 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:28.062 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:28.062 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:28.062 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:28.062 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:28.062 0000:00:04.1 (8086 2021): ioatdma 
-> vfio-pci 00:05:28.062 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:28.321 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:28.321 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:28.321 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:28.321 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:28.321 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:28.321 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:28.321 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:28.321 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:30.226 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:30.226 15:42:37 -- common/autotest_common.sh@1517 -- # sleep 1 00:05:31.164 15:42:38 -- common/autotest_common.sh@1518 -- # bdfs=() 00:05:31.164 15:42:38 -- common/autotest_common.sh@1518 -- # local bdfs 00:05:31.164 15:42:38 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:05:31.164 15:42:38 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:05:31.164 15:42:38 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:31.164 15:42:38 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:31.164 15:42:38 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:31.164 15:42:38 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:31.164 15:42:38 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:31.164 15:42:38 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:31.164 15:42:38 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:31.164 15:42:38 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:34.453 Waiting for block devices as requested 00:05:34.453 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:34.453 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:34.453 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:34.713 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:34.713 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:34.713 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:34.973 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:34.973 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:34.973 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:35.232 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:35.232 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:35.232 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:35.491 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:35.491 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:35.491 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:35.750 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:35.750 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:36.009 15:42:43 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:05:36.009 15:42:43 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:36.009 15:42:43 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:05:36.009 15:42:43 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:05:36.009 15:42:43 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:36.009 15:42:43 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:36.009 15:42:43 -- common/autotest_common.sh@1492 -- # basename 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:36.009 15:42:43 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:05:36.009 15:42:43 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:05:36.009 15:42:43 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:05:36.009 15:42:43 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:05:36.009 15:42:43 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:05:36.009 15:42:43 -- common/autotest_common.sh@1531 -- # grep oacs 00:05:36.009 15:42:43 -- common/autotest_common.sh@1531 -- # oacs=' 0xe' 00:05:36.009 15:42:43 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:05:36.009 15:42:43 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:05:36.009 15:42:43 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:36.009 15:42:43 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:36.009 15:42:43 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:05:36.009 15:42:43 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:05:36.009 15:42:43 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:05:36.009 15:42:43 -- common/autotest_common.sh@1543 -- # continue 00:05:36.009 15:42:43 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:05:36.009 15:42:43 -- common/autotest_common.sh@732 -- # xtrace_disable 00:05:36.009 15:42:43 -- common/autotest_common.sh@10 -- # set +x 00:05:36.009 15:42:43 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:05:36.009 15:42:43 -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:36.010 15:42:43 -- common/autotest_common.sh@10 -- # set +x 00:05:36.010 15:42:43 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:39.300 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:39.300 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:39.300 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:39.300 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:39.300 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:39.560 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:39.560 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:39.560 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:39.560 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:39.560 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:39.560 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:39.560 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:39.560 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:39.560 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:39.560 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:39.560 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:41.466 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:41.466 15:42:49 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:05:41.466 15:42:49 -- common/autotest_common.sh@732 -- # xtrace_disable 00:05:41.466 15:42:49 -- common/autotest_common.sh@10 -- # set +x 00:05:41.466 15:42:49 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:05:41.466 15:42:49 -- common/autotest_common.sh@1578 -- # mapfile -t bdfs 00:05:41.466 15:42:49 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs_by_id 0x0a54 00:05:41.466 15:42:49 -- common/autotest_common.sh@1563 -- # bdfs=() 00:05:41.466 15:42:49 -- common/autotest_common.sh@1563 -- # _bdfs=() 00:05:41.466 15:42:49 -- common/autotest_common.sh@1563 -- # local bdfs _bdfs 00:05:41.467 15:42:49 -- common/autotest_common.sh@1564 -- # _bdfs=($(get_nvme_bdfs)) 00:05:41.467 15:42:49 -- 
common/autotest_common.sh@1564 -- # get_nvme_bdfs 00:05:41.467 15:42:49 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:41.467 15:42:49 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:41.467 15:42:49 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:41.467 15:42:49 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:41.467 15:42:49 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:41.467 15:42:49 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:41.467 15:42:49 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:41.467 15:42:49 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}" 00:05:41.467 15:42:49 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:41.467 15:42:49 -- common/autotest_common.sh@1566 -- # device=0x0a54 00:05:41.467 15:42:49 -- common/autotest_common.sh@1567 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:41.467 15:42:49 -- common/autotest_common.sh@1568 -- # bdfs+=($bdf) 00:05:41.467 15:42:49 -- common/autotest_common.sh@1572 -- # (( 1 > 0 )) 00:05:41.467 15:42:49 -- common/autotest_common.sh@1573 -- # printf '%s\n' 0000:d8:00.0 00:05:41.467 15:42:49 -- common/autotest_common.sh@1579 -- # [[ -z 0000:d8:00.0 ]] 00:05:41.467 15:42:49 -- common/autotest_common.sh@1584 -- # spdk_tgt_pid=1692901 00:05:41.467 15:42:49 -- common/autotest_common.sh@1585 -- # waitforlisten 1692901 00:05:41.467 15:42:49 -- common/autotest_common.sh@1583 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:41.467 15:42:49 -- common/autotest_common.sh@835 -- # '[' -z 1692901 ']' 00:05:41.467 15:42:49 -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:41.467 15:42:49 -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:41.467 15:42:49 -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:41.467 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:41.467 15:42:49 -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:41.467 15:42:49 -- common/autotest_common.sh@10 -- # set +x 00:05:41.467 [2024-11-30 15:42:49.275151] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:05:41.467 [2024-11-30 15:42:49.275213] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1692901 ] 00:05:41.467 [2024-11-30 15:42:49.411100] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
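The get_nvme_bdfs trace above shows how autotest_common.sh discovers controllers: scripts/gen_nvme.sh emits a bdev config as JSON, and jq pulls each controller's PCI address out of .config[].params.traddr. A minimal standalone sketch of that pattern, assuming the same JSON shape the trace shows (the error message and the final printf are illustrative, not the helper's own):

    #!/usr/bin/env bash
    # Sketch of the BDF discovery traced above, not the verbatim helper.
    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # path as used in this run
    bdfs=()
    while IFS= read -r traddr; do
        bdfs+=("$traddr")               # one PCI address per attached NVMe controller
    done < <("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')
    (( ${#bdfs[@]} > 0 )) || { echo 'no NVMe controllers found' >&2; exit 1; }
    printf '%s\n' "${bdfs[@]}"          # prints 0000:d8:00.0 on this machine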
00:05:41.725 [2024-11-30 15:42:49.446494] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:41.725 [2024-11-30 15:42:49.469108] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.293 15:42:50 -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:42.293 15:42:50 -- common/autotest_common.sh@868 -- # return 0 00:05:42.293 15:42:50 -- common/autotest_common.sh@1587 -- # bdf_id=0 00:05:42.293 15:42:50 -- common/autotest_common.sh@1588 -- # for bdf in "${bdfs[@]}" 00:05:42.293 15:42:50 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:45.582 nvme0n1 00:05:45.582 15:42:53 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:45.582 [2024-11-30 15:42:53.314270] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:45.582 request: 00:05:45.582 { 00:05:45.582 "nvme_ctrlr_name": "nvme0", 00:05:45.582 "password": "test", 00:05:45.582 "method": "bdev_nvme_opal_revert", 00:05:45.582 "req_id": 1 00:05:45.582 } 00:05:45.582 Got JSON-RPC error response 00:05:45.582 response: 00:05:45.582 { 00:05:45.582 "code": -32602, 00:05:45.582 "message": "Invalid parameters" 00:05:45.582 } 00:05:45.582 15:42:53 -- common/autotest_common.sh@1591 -- # true 00:05:45.582 15:42:53 -- common/autotest_common.sh@1592 -- # (( ++bdf_id )) 00:05:45.582 15:42:53 -- common/autotest_common.sh@1595 -- # killprocess 1692901 00:05:45.582 15:42:53 -- common/autotest_common.sh@954 -- # '[' -z 1692901 ']' 00:05:45.582 15:42:53 -- common/autotest_common.sh@958 -- # kill -0 1692901 00:05:45.582 15:42:53 -- common/autotest_common.sh@959 -- # uname 00:05:45.582 15:42:53 -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:45.583 15:42:53 -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1692901 00:05:45.583 15:42:53 -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:45.583 15:42:53 -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:45.583 15:42:53 -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1692901' 00:05:45.583 killing process with pid 1692901 00:05:45.583 15:42:53 -- common/autotest_common.sh@973 -- # kill 1692901 00:05:45.583 15:42:53 -- common/autotest_common.sh@978 -- # wait 1692901 00:05:48.123 15:42:55 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:05:48.123 15:42:55 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:05:48.123 15:42:55 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:05:48.123 15:42:55 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:05:48.123 15:42:55 -- spdk/autotest.sh@149 -- # timing_enter lib 00:05:48.123 15:42:55 -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:48.123 15:42:55 -- common/autotest_common.sh@10 -- # set +x 00:05:48.123 15:42:55 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:05:48.123 15:42:55 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:48.123 15:42:55 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:48.123 15:42:55 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:48.123 15:42:55 -- common/autotest_common.sh@10 -- # set +x 00:05:48.123 ************************************ 00:05:48.123 START TEST env 00:05:48.123 ************************************ 00:05:48.123 15:42:55 env -- common/autotest_common.sh@1129 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:48.123 * Looking for test storage... 00:05:48.123 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:48.123 15:42:55 env -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:48.123 15:42:55 env -- common/autotest_common.sh@1693 -- # lcov --version 00:05:48.123 15:42:55 env -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:48.123 15:42:55 env -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:48.123 15:42:55 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:48.123 15:42:55 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:48.123 15:42:55 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:48.124 15:42:55 env -- scripts/common.sh@336 -- # IFS=.-: 00:05:48.124 15:42:55 env -- scripts/common.sh@336 -- # read -ra ver1 00:05:48.124 15:42:55 env -- scripts/common.sh@337 -- # IFS=.-: 00:05:48.124 15:42:55 env -- scripts/common.sh@337 -- # read -ra ver2 00:05:48.124 15:42:55 env -- scripts/common.sh@338 -- # local 'op=<' 00:05:48.124 15:42:55 env -- scripts/common.sh@340 -- # ver1_l=2 00:05:48.124 15:42:55 env -- scripts/common.sh@341 -- # ver2_l=1 00:05:48.124 15:42:55 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:48.124 15:42:55 env -- scripts/common.sh@344 -- # case "$op" in 00:05:48.124 15:42:55 env -- scripts/common.sh@345 -- # : 1 00:05:48.124 15:42:55 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:48.124 15:42:55 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:48.124 15:42:55 env -- scripts/common.sh@365 -- # decimal 1 00:05:48.124 15:42:55 env -- scripts/common.sh@353 -- # local d=1 00:05:48.124 15:42:55 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:48.124 15:42:55 env -- scripts/common.sh@355 -- # echo 1 00:05:48.124 15:42:55 env -- scripts/common.sh@365 -- # ver1[v]=1 00:05:48.124 15:42:55 env -- scripts/common.sh@366 -- # decimal 2 00:05:48.124 15:42:55 env -- scripts/common.sh@353 -- # local d=2 00:05:48.124 15:42:55 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:48.124 15:42:55 env -- scripts/common.sh@355 -- # echo 2 00:05:48.124 15:42:55 env -- scripts/common.sh@366 -- # ver2[v]=2 00:05:48.124 15:42:55 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:48.124 15:42:55 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:48.124 15:42:55 env -- scripts/common.sh@368 -- # return 0 00:05:48.124 15:42:55 env -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:48.124 15:42:55 env -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:48.124 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.124 --rc genhtml_branch_coverage=1 00:05:48.124 --rc genhtml_function_coverage=1 00:05:48.124 --rc genhtml_legend=1 00:05:48.124 --rc geninfo_all_blocks=1 00:05:48.124 --rc geninfo_unexecuted_blocks=1 00:05:48.124 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:48.124 ' 00:05:48.124 15:42:55 env -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:48.124 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.124 --rc genhtml_branch_coverage=1 00:05:48.124 --rc genhtml_function_coverage=1 00:05:48.124 --rc genhtml_legend=1 00:05:48.124 --rc geninfo_all_blocks=1 00:05:48.124 --rc geninfo_unexecuted_blocks=1 00:05:48.124 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:48.124 ' 00:05:48.124 15:42:55 env -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:48.124 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.124 --rc genhtml_branch_coverage=1 00:05:48.124 --rc genhtml_function_coverage=1 00:05:48.124 --rc genhtml_legend=1 00:05:48.124 --rc geninfo_all_blocks=1 00:05:48.124 --rc geninfo_unexecuted_blocks=1 00:05:48.124 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:48.124 ' 00:05:48.124 15:42:55 env -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:48.124 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.124 --rc genhtml_branch_coverage=1 00:05:48.124 --rc genhtml_function_coverage=1 00:05:48.124 --rc genhtml_legend=1 00:05:48.124 --rc geninfo_all_blocks=1 00:05:48.124 --rc geninfo_unexecuted_blocks=1 00:05:48.124 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:48.124 ' 00:05:48.124 15:42:55 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:48.124 15:42:55 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:48.124 15:42:55 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:48.124 15:42:55 env -- common/autotest_common.sh@10 -- # set +x 00:05:48.124 ************************************ 00:05:48.124 START TEST env_memory 00:05:48.124 ************************************ 00:05:48.124 15:42:55 env.env_memory -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:48.124 00:05:48.124 00:05:48.124 CUnit - A unit testing framework for C - Version 2.1-3 00:05:48.124 http://cunit.sourceforge.net/ 00:05:48.124 00:05:48.124 00:05:48.124 Suite: memory 00:05:48.124 Test: alloc and free memory map ...[2024-11-30 15:42:55.896545] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:48.124 passed 00:05:48.124 Test: mem map translation ...[2024-11-30 15:42:55.909811] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:48.124 [2024-11-30 15:42:55.909828] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:48.124 [2024-11-30 15:42:55.909857] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:48.124 [2024-11-30 15:42:55.909866] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:48.124 passed 00:05:48.124 Test: mem map registration ...[2024-11-30 15:42:55.929900] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:05:48.124 [2024-11-30 15:42:55.929917] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:05:48.124 passed 00:05:48.124 Test: mem 
map adjacent registrations ...passed 00:05:48.124 00:05:48.124 Run Summary: Type Total Ran Passed Failed Inactive 00:05:48.124 suites 1 1 n/a 0 0 00:05:48.124 tests 4 4 4 0 0 00:05:48.124 asserts 152 152 152 0 n/a 00:05:48.124 00:05:48.124 Elapsed time = 0.085 seconds 00:05:48.124 00:05:48.124 real 0m0.099s 00:05:48.124 user 0m0.084s 00:05:48.124 sys 0m0.015s 00:05:48.124 15:42:55 env.env_memory -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:48.124 15:42:55 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:48.124 ************************************ 00:05:48.124 END TEST env_memory 00:05:48.124 ************************************ 00:05:48.124 15:42:56 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:48.124 15:42:56 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:48.124 15:42:56 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:48.124 15:42:56 env -- common/autotest_common.sh@10 -- # set +x 00:05:48.124 ************************************ 00:05:48.124 START TEST env_vtophys 00:05:48.124 ************************************ 00:05:48.124 15:42:56 env.env_vtophys -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:48.124 EAL: lib.eal log level changed from notice to debug 00:05:48.124 EAL: Detected lcore 0 as core 0 on socket 0 00:05:48.124 EAL: Detected lcore 1 as core 1 on socket 0 00:05:48.124 EAL: Detected lcore 2 as core 2 on socket 0 00:05:48.124 EAL: Detected lcore 3 as core 3 on socket 0 00:05:48.124 EAL: Detected lcore 4 as core 4 on socket 0 00:05:48.124 EAL: Detected lcore 5 as core 5 on socket 0 00:05:48.124 EAL: Detected lcore 6 as core 6 on socket 0 00:05:48.124 EAL: Detected lcore 7 as core 8 on socket 0 00:05:48.124 EAL: Detected lcore 8 as core 9 on socket 0 00:05:48.124 EAL: Detected lcore 9 as core 10 on socket 0 00:05:48.124 EAL: Detected lcore 10 as core 11 on socket 0 00:05:48.124 EAL: Detected lcore 11 as core 12 on socket 0 00:05:48.124 EAL: Detected lcore 12 as core 13 on socket 0 00:05:48.124 EAL: Detected lcore 13 as core 14 on socket 0 00:05:48.124 EAL: Detected lcore 14 as core 16 on socket 0 00:05:48.124 EAL: Detected lcore 15 as core 17 on socket 0 00:05:48.124 EAL: Detected lcore 16 as core 18 on socket 0 00:05:48.124 EAL: Detected lcore 17 as core 19 on socket 0 00:05:48.124 EAL: Detected lcore 18 as core 20 on socket 0 00:05:48.124 EAL: Detected lcore 19 as core 21 on socket 0 00:05:48.124 EAL: Detected lcore 20 as core 22 on socket 0 00:05:48.124 EAL: Detected lcore 21 as core 24 on socket 0 00:05:48.124 EAL: Detected lcore 22 as core 25 on socket 0 00:05:48.124 EAL: Detected lcore 23 as core 26 on socket 0 00:05:48.124 EAL: Detected lcore 24 as core 27 on socket 0 00:05:48.124 EAL: Detected lcore 25 as core 28 on socket 0 00:05:48.124 EAL: Detected lcore 26 as core 29 on socket 0 00:05:48.124 EAL: Detected lcore 27 as core 30 on socket 0 00:05:48.124 EAL: Detected lcore 28 as core 0 on socket 1 00:05:48.124 EAL: Detected lcore 29 as core 1 on socket 1 00:05:48.124 EAL: Detected lcore 30 as core 2 on socket 1 00:05:48.124 EAL: Detected lcore 31 as core 3 on socket 1 00:05:48.124 EAL: Detected lcore 32 as core 4 on socket 1 00:05:48.124 EAL: Detected lcore 33 as core 5 on socket 1 00:05:48.124 EAL: Detected lcore 34 as core 6 on socket 1 00:05:48.124 EAL: Detected lcore 35 as core 8 on socket 1 00:05:48.124 EAL: Detected lcore 36 as core 9 on socket 1 
00:05:48.124 EAL: Detected lcore 37 as core 10 on socket 1 00:05:48.124 EAL: Detected lcore 38 as core 11 on socket 1 00:05:48.124 EAL: Detected lcore 39 as core 12 on socket 1 00:05:48.124 EAL: Detected lcore 40 as core 13 on socket 1 00:05:48.125 EAL: Detected lcore 41 as core 14 on socket 1 00:05:48.125 EAL: Detected lcore 42 as core 16 on socket 1 00:05:48.125 EAL: Detected lcore 43 as core 17 on socket 1 00:05:48.125 EAL: Detected lcore 44 as core 18 on socket 1 00:05:48.125 EAL: Detected lcore 45 as core 19 on socket 1 00:05:48.125 EAL: Detected lcore 46 as core 20 on socket 1 00:05:48.125 EAL: Detected lcore 47 as core 21 on socket 1 00:05:48.125 EAL: Detected lcore 48 as core 22 on socket 1 00:05:48.125 EAL: Detected lcore 49 as core 24 on socket 1 00:05:48.125 EAL: Detected lcore 50 as core 25 on socket 1 00:05:48.125 EAL: Detected lcore 51 as core 26 on socket 1 00:05:48.125 EAL: Detected lcore 52 as core 27 on socket 1 00:05:48.125 EAL: Detected lcore 53 as core 28 on socket 1 00:05:48.125 EAL: Detected lcore 54 as core 29 on socket 1 00:05:48.125 EAL: Detected lcore 55 as core 30 on socket 1 00:05:48.125 EAL: Detected lcore 56 as core 0 on socket 0 00:05:48.125 EAL: Detected lcore 57 as core 1 on socket 0 00:05:48.125 EAL: Detected lcore 58 as core 2 on socket 0 00:05:48.125 EAL: Detected lcore 59 as core 3 on socket 0 00:05:48.125 EAL: Detected lcore 60 as core 4 on socket 0 00:05:48.125 EAL: Detected lcore 61 as core 5 on socket 0 00:05:48.125 EAL: Detected lcore 62 as core 6 on socket 0 00:05:48.125 EAL: Detected lcore 63 as core 8 on socket 0 00:05:48.125 EAL: Detected lcore 64 as core 9 on socket 0 00:05:48.125 EAL: Detected lcore 65 as core 10 on socket 0 00:05:48.125 EAL: Detected lcore 66 as core 11 on socket 0 00:05:48.125 EAL: Detected lcore 67 as core 12 on socket 0 00:05:48.125 EAL: Detected lcore 68 as core 13 on socket 0 00:05:48.125 EAL: Detected lcore 69 as core 14 on socket 0 00:05:48.125 EAL: Detected lcore 70 as core 16 on socket 0 00:05:48.125 EAL: Detected lcore 71 as core 17 on socket 0 00:05:48.125 EAL: Detected lcore 72 as core 18 on socket 0 00:05:48.125 EAL: Detected lcore 73 as core 19 on socket 0 00:05:48.125 EAL: Detected lcore 74 as core 20 on socket 0 00:05:48.125 EAL: Detected lcore 75 as core 21 on socket 0 00:05:48.125 EAL: Detected lcore 76 as core 22 on socket 0 00:05:48.125 EAL: Detected lcore 77 as core 24 on socket 0 00:05:48.125 EAL: Detected lcore 78 as core 25 on socket 0 00:05:48.125 EAL: Detected lcore 79 as core 26 on socket 0 00:05:48.125 EAL: Detected lcore 80 as core 27 on socket 0 00:05:48.125 EAL: Detected lcore 81 as core 28 on socket 0 00:05:48.125 EAL: Detected lcore 82 as core 29 on socket 0 00:05:48.125 EAL: Detected lcore 83 as core 30 on socket 0 00:05:48.125 EAL: Detected lcore 84 as core 0 on socket 1 00:05:48.125 EAL: Detected lcore 85 as core 1 on socket 1 00:05:48.125 EAL: Detected lcore 86 as core 2 on socket 1 00:05:48.125 EAL: Detected lcore 87 as core 3 on socket 1 00:05:48.125 EAL: Detected lcore 88 as core 4 on socket 1 00:05:48.125 EAL: Detected lcore 89 as core 5 on socket 1 00:05:48.125 EAL: Detected lcore 90 as core 6 on socket 1 00:05:48.125 EAL: Detected lcore 91 as core 8 on socket 1 00:05:48.125 EAL: Detected lcore 92 as core 9 on socket 1 00:05:48.125 EAL: Detected lcore 93 as core 10 on socket 1 00:05:48.125 EAL: Detected lcore 94 as core 11 on socket 1 00:05:48.125 EAL: Detected lcore 95 as core 12 on socket 1 00:05:48.125 EAL: Detected lcore 96 as core 13 on socket 1 00:05:48.125 EAL: Detected lcore 
97 as core 14 on socket 1 00:05:48.125 EAL: Detected lcore 98 as core 16 on socket 1 00:05:48.125 EAL: Detected lcore 99 as core 17 on socket 1 00:05:48.125 EAL: Detected lcore 100 as core 18 on socket 1 00:05:48.125 EAL: Detected lcore 101 as core 19 on socket 1 00:05:48.125 EAL: Detected lcore 102 as core 20 on socket 1 00:05:48.125 EAL: Detected lcore 103 as core 21 on socket 1 00:05:48.125 EAL: Detected lcore 104 as core 22 on socket 1 00:05:48.125 EAL: Detected lcore 105 as core 24 on socket 1 00:05:48.125 EAL: Detected lcore 106 as core 25 on socket 1 00:05:48.125 EAL: Detected lcore 107 as core 26 on socket 1 00:05:48.125 EAL: Detected lcore 108 as core 27 on socket 1 00:05:48.125 EAL: Detected lcore 109 as core 28 on socket 1 00:05:48.125 EAL: Detected lcore 110 as core 29 on socket 1 00:05:48.125 EAL: Detected lcore 111 as core 30 on socket 1 00:05:48.125 EAL: Maximum logical cores by configuration: 128 00:05:48.125 EAL: Detected CPU lcores: 112 00:05:48.125 EAL: Detected NUMA nodes: 2 00:05:48.125 EAL: Checking presence of .so 'librte_eal.so.25.0' 00:05:48.125 EAL: Checking presence of .so 'librte_eal.so.25' 00:05:48.125 EAL: Checking presence of .so 'librte_eal.so' 00:05:48.125 EAL: Detected static linkage of DPDK 00:05:48.125 EAL: No shared files mode enabled, IPC will be disabled 00:05:48.384 EAL: Bus pci wants IOVA as 'DC' 00:05:48.384 EAL: Buses did not request a specific IOVA mode. 00:05:48.384 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:48.384 EAL: Selected IOVA mode 'VA' 00:05:48.384 EAL: Probing VFIO support... 00:05:48.384 EAL: No shared files mode enabled, IPC is disabled 00:05:48.384 EAL: IOMMU type 1 (Type 1) is supported 00:05:48.384 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:48.384 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:48.384 EAL: VFIO support initialized 00:05:48.384 EAL: Ask a virtual area of 0x2e000 bytes 00:05:48.384 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:48.385 EAL: Setting up physically contiguous memory... 
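Every "Detected lcore N as core M on socket S" line above is EAL's CPU topology scan; the same table can be rebuilt from standard Linux sysfs attributes. A rough equivalent for cross-checking the layout (the sysfs paths are kernel-standard; the loop is an illustration, not EAL's actual probe code):

    # Rebuild the lcore/core/socket table from sysfs for comparison with the EAL log.
    for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
        lcore=${cpu##*cpu}                                  # logical CPU number
        core=$(<"$cpu/topology/core_id")                    # physical core within the package
        socket=$(<"$cpu/topology/physical_package_id")      # package / socket id
        echo "lcore $lcore as core $core on socket $socket"
    done | sort -n -k2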
00:05:48.385 EAL: Setting maximum number of open files to 524288 00:05:48.385 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:48.385 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:48.385 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:48.385 EAL: Ask a virtual area of 0x61000 bytes 00:05:48.385 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:48.385 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:48.385 EAL: Ask a virtual area of 0x400000000 bytes 00:05:48.385 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:48.385 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:48.385 EAL: Ask a virtual area of 0x61000 bytes 00:05:48.385 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:48.385 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:48.385 EAL: Ask a virtual area of 0x400000000 bytes 00:05:48.385 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:48.385 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:48.385 EAL: Ask a virtual area of 0x61000 bytes 00:05:48.385 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:48.385 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:48.385 EAL: Ask a virtual area of 0x400000000 bytes 00:05:48.385 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:48.385 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:48.385 EAL: Ask a virtual area of 0x61000 bytes 00:05:48.385 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:48.385 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:48.385 EAL: Ask a virtual area of 0x400000000 bytes 00:05:48.385 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:48.385 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:48.385 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:48.385 EAL: Ask a virtual area of 0x61000 bytes 00:05:48.385 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:48.385 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:48.385 EAL: Ask a virtual area of 0x400000000 bytes 00:05:48.385 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:48.385 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:48.385 EAL: Ask a virtual area of 0x61000 bytes 00:05:48.385 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:48.385 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:48.385 EAL: Ask a virtual area of 0x400000000 bytes 00:05:48.385 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:48.385 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:48.385 EAL: Ask a virtual area of 0x61000 bytes 00:05:48.385 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:48.385 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:48.385 EAL: Ask a virtual area of 0x400000000 bytes 00:05:48.385 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:48.385 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:48.385 EAL: Ask a virtual area of 0x61000 bytes 00:05:48.385 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:48.385 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:48.385 EAL: Ask a virtual area of 0x400000000 bytes 00:05:48.385 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:05:48.385 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:48.385 EAL: Hugepages will be freed exactly as allocated. 00:05:48.385 EAL: No shared files mode enabled, IPC is disabled 00:05:48.385 EAL: No shared files mode enabled, IPC is disabled 00:05:48.385 EAL: Refined arch frequency 2500000000 to measured frequency 2494135104 00:05:48.385 EAL: TSC frequency is ~2494100 KHz 00:05:48.385 EAL: Main lcore 0 is ready (tid=7f4a61024a00;cpuset=[0]) 00:05:48.385 EAL: Trying to obtain current memory policy. 00:05:48.385 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.385 EAL: Restoring previous memory policy: 0 00:05:48.385 EAL: request: mp_malloc_sync 00:05:48.385 EAL: No shared files mode enabled, IPC is disabled 00:05:48.385 EAL: Heap on socket 0 was expanded by 2MB 00:05:48.385 EAL: Allocated 2112 bytes of per-lcore data with a 64-byte alignment 00:05:48.385 EAL: Mem event callback 'spdk:(nil)' registered 00:05:48.385 00:05:48.385 00:05:48.385 CUnit - A unit testing framework for C - Version 2.1-3 00:05:48.385 http://cunit.sourceforge.net/ 00:05:48.385 00:05:48.385 00:05:48.385 Suite: components_suite 00:05:48.385 Test: vtophys_malloc_test ...passed 00:05:48.385 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:48.385 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.385 EAL: Restoring previous memory policy: 4 00:05:48.385 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.385 EAL: request: mp_malloc_sync 00:05:48.385 EAL: No shared files mode enabled, IPC is disabled 00:05:48.385 EAL: Heap on socket 0 was expanded by 4MB 00:05:48.385 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.385 EAL: request: mp_malloc_sync 00:05:48.385 EAL: No shared files mode enabled, IPC is disabled 00:05:48.385 EAL: Heap on socket 0 was shrunk by 4MB 00:05:48.385 EAL: Trying to obtain current memory policy. 00:05:48.385 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.385 EAL: Restoring previous memory policy: 4 00:05:48.385 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.385 EAL: request: mp_malloc_sync 00:05:48.385 EAL: No shared files mode enabled, IPC is disabled 00:05:48.385 EAL: Heap on socket 0 was expanded by 6MB 00:05:48.385 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.385 EAL: request: mp_malloc_sync 00:05:48.385 EAL: No shared files mode enabled, IPC is disabled 00:05:48.385 EAL: Heap on socket 0 was shrunk by 6MB 00:05:48.385 EAL: Trying to obtain current memory policy. 00:05:48.385 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.385 EAL: Restoring previous memory policy: 4 00:05:48.385 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.385 EAL: request: mp_malloc_sync 00:05:48.385 EAL: No shared files mode enabled, IPC is disabled 00:05:48.385 EAL: Heap on socket 0 was expanded by 10MB 00:05:48.385 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.385 EAL: request: mp_malloc_sync 00:05:48.385 EAL: No shared files mode enabled, IPC is disabled 00:05:48.385 EAL: Heap on socket 0 was shrunk by 10MB 00:05:48.385 EAL: Trying to obtain current memory policy. 
00:05:48.385 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.385 EAL: Restoring previous memory policy: 4 00:05:48.385 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.385 EAL: request: mp_malloc_sync 00:05:48.385 EAL: No shared files mode enabled, IPC is disabled 00:05:48.385 EAL: Heap on socket 0 was expanded by 18MB 00:05:48.385 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.385 EAL: request: mp_malloc_sync 00:05:48.385 EAL: No shared files mode enabled, IPC is disabled 00:05:48.385 EAL: Heap on socket 0 was shrunk by 18MB 00:05:48.385 EAL: Trying to obtain current memory policy. 00:05:48.385 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.386 EAL: Restoring previous memory policy: 4 00:05:48.386 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.386 EAL: request: mp_malloc_sync 00:05:48.386 EAL: No shared files mode enabled, IPC is disabled 00:05:48.386 EAL: Heap on socket 0 was expanded by 34MB 00:05:48.386 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.386 EAL: request: mp_malloc_sync 00:05:48.386 EAL: No shared files mode enabled, IPC is disabled 00:05:48.386 EAL: Heap on socket 0 was shrunk by 34MB 00:05:48.386 EAL: Trying to obtain current memory policy. 00:05:48.386 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.386 EAL: Restoring previous memory policy: 4 00:05:48.386 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.386 EAL: request: mp_malloc_sync 00:05:48.386 EAL: No shared files mode enabled, IPC is disabled 00:05:48.386 EAL: Heap on socket 0 was expanded by 66MB 00:05:48.386 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.386 EAL: request: mp_malloc_sync 00:05:48.386 EAL: No shared files mode enabled, IPC is disabled 00:05:48.386 EAL: Heap on socket 0 was shrunk by 66MB 00:05:48.386 EAL: Trying to obtain current memory policy. 00:05:48.386 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.386 EAL: Restoring previous memory policy: 4 00:05:48.386 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.386 EAL: request: mp_malloc_sync 00:05:48.386 EAL: No shared files mode enabled, IPC is disabled 00:05:48.386 EAL: Heap on socket 0 was expanded by 130MB 00:05:48.386 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.386 EAL: request: mp_malloc_sync 00:05:48.386 EAL: No shared files mode enabled, IPC is disabled 00:05:48.386 EAL: Heap on socket 0 was shrunk by 130MB 00:05:48.386 EAL: Trying to obtain current memory policy. 00:05:48.386 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.645 EAL: Restoring previous memory policy: 4 00:05:48.645 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.645 EAL: request: mp_malloc_sync 00:05:48.645 EAL: No shared files mode enabled, IPC is disabled 00:05:48.645 EAL: Heap on socket 0 was expanded by 258MB 00:05:48.645 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.645 EAL: request: mp_malloc_sync 00:05:48.645 EAL: No shared files mode enabled, IPC is disabled 00:05:48.645 EAL: Heap on socket 0 was shrunk by 258MB 00:05:48.645 EAL: Trying to obtain current memory policy. 
00:05:48.645 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:48.645 EAL: Restoring previous memory policy: 4 00:05:48.645 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.645 EAL: request: mp_malloc_sync 00:05:48.645 EAL: No shared files mode enabled, IPC is disabled 00:05:48.645 EAL: Heap on socket 0 was expanded by 514MB 00:05:48.903 EAL: Calling mem event callback 'spdk:(nil)' 00:05:48.903 EAL: request: mp_malloc_sync 00:05:48.903 EAL: No shared files mode enabled, IPC is disabled 00:05:48.904 EAL: Heap on socket 0 was shrunk by 514MB 00:05:48.904 EAL: Trying to obtain current memory policy. 00:05:48.904 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:49.163 EAL: Restoring previous memory policy: 4 00:05:49.163 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.163 EAL: request: mp_malloc_sync 00:05:49.163 EAL: No shared files mode enabled, IPC is disabled 00:05:49.163 EAL: Heap on socket 0 was expanded by 1026MB 00:05:49.163 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.422 EAL: request: mp_malloc_sync 00:05:49.422 EAL: No shared files mode enabled, IPC is disabled 00:05:49.422 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:49.422 passed 00:05:49.422 00:05:49.422 Run Summary: Type Total Ran Passed Failed Inactive 00:05:49.422 suites 1 1 n/a 0 0 00:05:49.422 tests 2 2 2 0 0 00:05:49.422 asserts 497 497 497 0 n/a 00:05:49.422 00:05:49.422 Elapsed time = 0.961 seconds 00:05:49.422 EAL: Calling mem event callback 'spdk:(nil)' 00:05:49.422 EAL: request: mp_malloc_sync 00:05:49.422 EAL: No shared files mode enabled, IPC is disabled 00:05:49.422 EAL: Heap on socket 0 was shrunk by 2MB 00:05:49.422 EAL: No shared files mode enabled, IPC is disabled 00:05:49.422 EAL: No shared files mode enabled, IPC is disabled 00:05:49.422 EAL: No shared files mode enabled, IPC is disabled 00:05:49.422 00:05:49.422 real 0m1.183s 00:05:49.422 user 0m0.608s 00:05:49.422 sys 0m0.447s 00:05:49.422 15:42:57 env.env_vtophys -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:49.422 15:42:57 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:05:49.422 ************************************ 00:05:49.422 END TEST env_vtophys 00:05:49.422 ************************************ 00:05:49.422 15:42:57 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:49.422 15:42:57 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:49.422 15:42:57 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:49.422 15:42:57 env -- common/autotest_common.sh@10 -- # set +x 00:05:49.422 ************************************ 00:05:49.422 START TEST env_pci 00:05:49.422 ************************************ 00:05:49.422 15:42:57 env.env_pci -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:49.422 00:05:49.422 00:05:49.422 CUnit - A unit testing framework for C - Version 2.1-3 00:05:49.422 http://cunit.sourceforge.net/ 00:05:49.422 00:05:49.422 00:05:49.422 Suite: pci 00:05:49.422 Test: pci_hook ...[2024-11-30 15:42:57.322714] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1118:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1694447 has claimed it 00:05:49.422 EAL: Cannot find device (10000:00:01.0) 00:05:49.422 EAL: Failed to attach device on primary process 00:05:49.422 passed 00:05:49.422 00:05:49.422 Run Summary: Type Total Ran Passed Failed Inactive 
00:05:49.422 suites 1 1 n/a 0 0 00:05:49.422 tests 1 1 1 0 0 00:05:49.422 asserts 25 25 25 0 n/a 00:05:49.422 00:05:49.422 Elapsed time = 0.036 seconds 00:05:49.422 00:05:49.422 real 0m0.056s 00:05:49.422 user 0m0.011s 00:05:49.422 sys 0m0.044s 00:05:49.422 15:42:57 env.env_pci -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:49.422 15:42:57 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:05:49.422 ************************************ 00:05:49.422 END TEST env_pci 00:05:49.422 ************************************ 00:05:49.681 15:42:57 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:49.681 15:42:57 env -- env/env.sh@15 -- # uname 00:05:49.681 15:42:57 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:49.681 15:42:57 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:49.681 15:42:57 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:49.681 15:42:57 env -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:05:49.682 15:42:57 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:49.682 15:42:57 env -- common/autotest_common.sh@10 -- # set +x 00:05:49.682 ************************************ 00:05:49.682 START TEST env_dpdk_post_init 00:05:49.682 ************************************ 00:05:49.682 15:42:57 env.env_dpdk_post_init -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:49.682 EAL: Detected CPU lcores: 112 00:05:49.682 EAL: Detected NUMA nodes: 2 00:05:49.682 EAL: Detected static linkage of DPDK 00:05:49.682 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:49.682 EAL: Selected IOVA mode 'VA' 00:05:49.682 EAL: VFIO support initialized 00:05:49.941 EAL: Using IOMMU type 1 (Type 1) 00:05:55.239 Starting DPDK initialization... 00:05:55.239 Starting SPDK post initialization... 00:05:55.239 SPDK NVMe probe 00:05:55.239 Attaching to 0000:d8:00.0 00:05:55.239 Attached to 0000:d8:00.0 00:05:55.239 Cleaning up... 
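The probe can attach to 0000:d8:00.0 only because setup.sh had already rebound the device from the kernel nvme driver to vfio-pci (the "nvme -> vfio-pci" lines earlier in this log). That rebind is the standard sysfs driver_override sequence; a hand-rolled sketch assuming the vfio-pci module is already loaded, not setup.sh's exact code:

    # Generic driver_override rebind, the mechanism behind the "nvme -> vfio-pci" lines.
    bdf=0000:d8:00.0
    dev=/sys/bus/pci/devices/$bdf
    [[ -e $dev/driver ]] && echo "$bdf" > "$dev/driver/unbind"   # detach the current driver
    echo vfio-pci > "$dev/driver_override"                       # pin the next probe to vfio-pci
    echo "$bdf" > /sys/bus/pci/drivers_probe                     # ask the kernel to re-probe
    echo > "$dev/driver_override"                                # clear the override afterwards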
00:05:55.239 00:05:55.239 real 0m4.745s 00:05:55.239 user 0m3.234s 00:05:55.239 sys 0m0.654s 00:05:55.239 15:43:02 env.env_dpdk_post_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:55.239 15:43:02 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:05:55.239 ************************************ 00:05:55.239 END TEST env_dpdk_post_init 00:05:55.239 ************************************ 00:05:55.239 15:43:02 env -- env/env.sh@26 -- # uname 00:05:55.239 15:43:02 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:55.239 15:43:02 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:55.239 15:43:02 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:55.239 15:43:02 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:55.239 15:43:02 env -- common/autotest_common.sh@10 -- # set +x 00:05:55.239 ************************************ 00:05:55.239 START TEST env_mem_callbacks 00:05:55.239 ************************************ 00:05:55.239 15:43:02 env.env_mem_callbacks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:55.239 EAL: Detected CPU lcores: 112 00:05:55.239 EAL: Detected NUMA nodes: 2 00:05:55.239 EAL: Detected static linkage of DPDK 00:05:55.239 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:55.239 EAL: Selected IOVA mode 'VA' 00:05:55.239 EAL: VFIO support initialized 00:05:55.239 00:05:55.239 00:05:55.239 CUnit - A unit testing framework for C - Version 2.1-3 00:05:55.239 http://cunit.sourceforge.net/ 00:05:55.239 00:05:55.239 00:05:55.239 Suite: memory 00:05:55.239 Test: test ... 00:05:55.239 register 0x200000200000 2097152 00:05:55.239 malloc 3145728 00:05:55.239 register 0x200000400000 4194304 00:05:55.239 buf 0x200000500000 len 3145728 PASSED 00:05:55.239 malloc 64 00:05:55.239 buf 0x2000004fff40 len 64 PASSED 00:05:55.239 malloc 4194304 00:05:55.239 register 0x200000800000 6291456 00:05:55.239 buf 0x200000a00000 len 4194304 PASSED 00:05:55.239 free 0x200000500000 3145728 00:05:55.239 free 0x2000004fff40 64 00:05:55.239 unregister 0x200000400000 4194304 PASSED 00:05:55.239 free 0x200000a00000 4194304 00:05:55.239 unregister 0x200000800000 6291456 PASSED 00:05:55.239 malloc 8388608 00:05:55.239 register 0x200000400000 10485760 00:05:55.239 buf 0x200000600000 len 8388608 PASSED 00:05:55.239 free 0x200000600000 8388608 00:05:55.239 unregister 0x200000400000 10485760 PASSED 00:05:55.239 passed 00:05:55.239 00:05:55.239 Run Summary: Type Total Ran Passed Failed Inactive 00:05:55.239 suites 1 1 n/a 0 0 00:05:55.239 tests 1 1 1 0 0 00:05:55.239 asserts 15 15 15 0 n/a 00:05:55.239 00:05:55.239 Elapsed time = 0.005 seconds 00:05:55.239 00:05:55.239 real 0m0.165s 00:05:55.239 user 0m0.017s 00:05:55.239 sys 0m0.048s 00:05:55.239 15:43:02 env.env_mem_callbacks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:55.239 15:43:02 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:05:55.239 ************************************ 00:05:55.239 END TEST env_mem_callbacks 00:05:55.239 ************************************ 00:05:55.239 00:05:55.239 real 0m6.845s 00:05:55.239 user 0m4.197s 00:05:55.239 sys 0m1.610s 00:05:55.239 15:43:02 env -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:55.239 15:43:02 env -- common/autotest_common.sh@10 -- # set +x 00:05:55.239 ************************************ 00:05:55.239 END TEST env 
00:05:55.239 ************************************ 00:05:55.239 15:43:02 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:55.239 15:43:02 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:55.239 15:43:02 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:55.239 15:43:02 -- common/autotest_common.sh@10 -- # set +x 00:05:55.239 ************************************ 00:05:55.239 START TEST rpc 00:05:55.239 ************************************ 00:05:55.239 15:43:02 rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:55.239 * Looking for test storage... 00:05:55.239 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:55.239 15:43:02 rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:55.239 15:43:02 rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:05:55.239 15:43:02 rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:55.239 15:43:02 rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:55.239 15:43:02 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:55.239 15:43:02 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:55.239 15:43:02 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:55.239 15:43:02 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:05:55.239 15:43:02 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:05:55.239 15:43:02 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:05:55.239 15:43:02 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:05:55.239 15:43:02 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:05:55.239 15:43:02 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:05:55.239 15:43:02 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:05:55.239 15:43:02 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:55.239 15:43:02 rpc -- scripts/common.sh@344 -- # case "$op" in 00:05:55.239 15:43:02 rpc -- scripts/common.sh@345 -- # : 1 00:05:55.239 15:43:02 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:55.239 15:43:02 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:55.239 15:43:02 rpc -- scripts/common.sh@365 -- # decimal 1 00:05:55.239 15:43:02 rpc -- scripts/common.sh@353 -- # local d=1 00:05:55.239 15:43:02 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:55.239 15:43:02 rpc -- scripts/common.sh@355 -- # echo 1 00:05:55.239 15:43:02 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:05:55.239 15:43:02 rpc -- scripts/common.sh@366 -- # decimal 2 00:05:55.239 15:43:02 rpc -- scripts/common.sh@353 -- # local d=2 00:05:55.239 15:43:02 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:55.239 15:43:02 rpc -- scripts/common.sh@355 -- # echo 2 00:05:55.239 15:43:02 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:05:55.239 15:43:02 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:55.239 15:43:02 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:55.239 15:43:02 rpc -- scripts/common.sh@368 -- # return 0 00:05:55.239 15:43:02 rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:55.239 15:43:02 rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:55.239 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:55.239 --rc genhtml_branch_coverage=1 00:05:55.239 --rc genhtml_function_coverage=1 00:05:55.239 --rc genhtml_legend=1 00:05:55.239 --rc geninfo_all_blocks=1 00:05:55.239 --rc geninfo_unexecuted_blocks=1 00:05:55.239 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:55.239 ' 00:05:55.239 15:43:02 rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:55.239 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:55.239 --rc genhtml_branch_coverage=1 00:05:55.239 --rc genhtml_function_coverage=1 00:05:55.239 --rc genhtml_legend=1 00:05:55.239 --rc geninfo_all_blocks=1 00:05:55.239 --rc geninfo_unexecuted_blocks=1 00:05:55.239 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:55.239 ' 00:05:55.239 15:43:02 rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:55.239 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:55.239 --rc genhtml_branch_coverage=1 00:05:55.239 --rc genhtml_function_coverage=1 00:05:55.239 --rc genhtml_legend=1 00:05:55.239 --rc geninfo_all_blocks=1 00:05:55.239 --rc geninfo_unexecuted_blocks=1 00:05:55.240 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:55.240 ' 00:05:55.240 15:43:02 rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:55.240 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:55.240 --rc genhtml_branch_coverage=1 00:05:55.240 --rc genhtml_function_coverage=1 00:05:55.240 --rc genhtml_legend=1 00:05:55.240 --rc geninfo_all_blocks=1 00:05:55.240 --rc geninfo_unexecuted_blocks=1 00:05:55.240 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:55.240 ' 00:05:55.240 15:43:02 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1695383 00:05:55.240 15:43:02 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:55.240 15:43:02 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:55.240 15:43:02 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1695383 00:05:55.240 15:43:02 rpc -- common/autotest_common.sh@835 -- # '[' -z 1695383 ']' 00:05:55.240 15:43:02 rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:55.240 15:43:02 rpc 
-- common/autotest_common.sh@840 -- # local max_retries=100 00:05:55.240 15:43:02 rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:55.240 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:55.240 15:43:02 rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:55.240 15:43:02 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:55.240 [2024-11-30 15:43:02.747220] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:05:55.240 [2024-11-30 15:43:02.747287] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1695383 ] 00:05:55.240 [2024-11-30 15:43:02.883302] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:05:55.240 [2024-11-30 15:43:02.918702] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:55.240 [2024-11-30 15:43:02.940249] app.c: 612:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:55.240 [2024-11-30 15:43:02.940286] app.c: 616:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1695383' to capture a snapshot of events at runtime. 00:05:55.240 [2024-11-30 15:43:02.940295] app.c: 618:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:55.240 [2024-11-30 15:43:02.940303] app.c: 619:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:55.240 [2024-11-30 15:43:02.940326] app.c: 620:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1695383 for offline analysis/debug. 
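[editor's note] The app_setup_trace notices above spell out how to pull a tracepoint snapshot from this running target. A minimal sketch of that, assuming the default build layout and the pid printed in the notice (1695383 here; the -f form against the shm copy is an assumption on my part, the notice itself only says to copy the file for offline analysis):
  sudo ./build/bin/spdk_trace -s spdk_tgt -p 1695383                  # live snapshot, exactly as the notice suggests
  sudo ./build/bin/spdk_trace -f /dev/shm/spdk_tgt_trace.pid1695383   # offline, after the target exits (assumed flag)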
00:05:55.240 [2024-11-30 15:43:02.940936] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.808 15:43:03 rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:55.808 15:43:03 rpc -- common/autotest_common.sh@868 -- # return 0 00:05:55.808 15:43:03 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:55.808 15:43:03 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:55.808 15:43:03 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:55.808 15:43:03 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:55.808 15:43:03 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:55.808 15:43:03 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:55.808 15:43:03 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:55.808 ************************************ 00:05:55.808 START TEST rpc_integrity 00:05:55.808 ************************************ 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:55.808 { 00:05:55.808 "name": "Malloc0", 00:05:55.808 "aliases": [ 00:05:55.808 "83e93e6d-05b7-465c-9e62-54683124aad7" 00:05:55.808 ], 00:05:55.808 "product_name": "Malloc disk", 00:05:55.808 "block_size": 512, 00:05:55.808 "num_blocks": 16384, 00:05:55.808 "uuid": "83e93e6d-05b7-465c-9e62-54683124aad7", 00:05:55.808 "assigned_rate_limits": { 00:05:55.808 "rw_ios_per_sec": 0, 00:05:55.808 "rw_mbytes_per_sec": 0, 00:05:55.808 "r_mbytes_per_sec": 0, 00:05:55.808 "w_mbytes_per_sec": 
0 00:05:55.808 }, 00:05:55.808 "claimed": false, 00:05:55.808 "zoned": false, 00:05:55.808 "supported_io_types": { 00:05:55.808 "read": true, 00:05:55.808 "write": true, 00:05:55.808 "unmap": true, 00:05:55.808 "flush": true, 00:05:55.808 "reset": true, 00:05:55.808 "nvme_admin": false, 00:05:55.808 "nvme_io": false, 00:05:55.808 "nvme_io_md": false, 00:05:55.808 "write_zeroes": true, 00:05:55.808 "zcopy": true, 00:05:55.808 "get_zone_info": false, 00:05:55.808 "zone_management": false, 00:05:55.808 "zone_append": false, 00:05:55.808 "compare": false, 00:05:55.808 "compare_and_write": false, 00:05:55.808 "abort": true, 00:05:55.808 "seek_hole": false, 00:05:55.808 "seek_data": false, 00:05:55.808 "copy": true, 00:05:55.808 "nvme_iov_md": false 00:05:55.808 }, 00:05:55.808 "memory_domains": [ 00:05:55.808 { 00:05:55.808 "dma_device_id": "system", 00:05:55.808 "dma_device_type": 1 00:05:55.808 }, 00:05:55.808 { 00:05:55.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:55.808 "dma_device_type": 2 00:05:55.808 } 00:05:55.808 ], 00:05:55.808 "driver_specific": {} 00:05:55.808 } 00:05:55.808 ]' 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:55.808 [2024-11-30 15:43:03.756276] vbdev_passthru.c: 608:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:55.808 [2024-11-30 15:43:03.756307] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:55.808 [2024-11-30 15:43:03.756324] vbdev_passthru.c: 682:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x624bec0 00:05:55.808 [2024-11-30 15:43:03.756333] vbdev_passthru.c: 697:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:55.808 [2024-11-30 15:43:03.757221] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:55.808 [2024-11-30 15:43:03.757242] vbdev_passthru.c: 711:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:55.808 Passthru0 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:55.808 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:55.808 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.067 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.067 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:56.067 { 00:05:56.067 "name": "Malloc0", 00:05:56.067 "aliases": [ 00:05:56.067 "83e93e6d-05b7-465c-9e62-54683124aad7" 00:05:56.067 ], 00:05:56.067 "product_name": "Malloc disk", 00:05:56.067 "block_size": 512, 00:05:56.067 "num_blocks": 16384, 00:05:56.067 "uuid": "83e93e6d-05b7-465c-9e62-54683124aad7", 00:05:56.067 "assigned_rate_limits": { 00:05:56.067 "rw_ios_per_sec": 0, 00:05:56.067 "rw_mbytes_per_sec": 0, 00:05:56.067 "r_mbytes_per_sec": 0, 00:05:56.067 "w_mbytes_per_sec": 0 00:05:56.067 }, 00:05:56.067 "claimed": true, 00:05:56.067 "claim_type": "exclusive_write", 00:05:56.067 "zoned": false, 00:05:56.067 "supported_io_types": { 00:05:56.067 "read": true, 00:05:56.067 "write": true, 00:05:56.067 "unmap": true, 
00:05:56.067 "flush": true, 00:05:56.067 "reset": true, 00:05:56.067 "nvme_admin": false, 00:05:56.067 "nvme_io": false, 00:05:56.067 "nvme_io_md": false, 00:05:56.067 "write_zeroes": true, 00:05:56.067 "zcopy": true, 00:05:56.067 "get_zone_info": false, 00:05:56.067 "zone_management": false, 00:05:56.067 "zone_append": false, 00:05:56.067 "compare": false, 00:05:56.067 "compare_and_write": false, 00:05:56.067 "abort": true, 00:05:56.067 "seek_hole": false, 00:05:56.067 "seek_data": false, 00:05:56.067 "copy": true, 00:05:56.067 "nvme_iov_md": false 00:05:56.067 }, 00:05:56.067 "memory_domains": [ 00:05:56.067 { 00:05:56.067 "dma_device_id": "system", 00:05:56.067 "dma_device_type": 1 00:05:56.067 }, 00:05:56.067 { 00:05:56.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:56.067 "dma_device_type": 2 00:05:56.067 } 00:05:56.067 ], 00:05:56.067 "driver_specific": {} 00:05:56.067 }, 00:05:56.067 { 00:05:56.067 "name": "Passthru0", 00:05:56.067 "aliases": [ 00:05:56.067 "ac94da2b-2819-5ec3-9bc9-c5e2326781dc" 00:05:56.067 ], 00:05:56.067 "product_name": "passthru", 00:05:56.067 "block_size": 512, 00:05:56.067 "num_blocks": 16384, 00:05:56.067 "uuid": "ac94da2b-2819-5ec3-9bc9-c5e2326781dc", 00:05:56.067 "assigned_rate_limits": { 00:05:56.067 "rw_ios_per_sec": 0, 00:05:56.067 "rw_mbytes_per_sec": 0, 00:05:56.067 "r_mbytes_per_sec": 0, 00:05:56.067 "w_mbytes_per_sec": 0 00:05:56.067 }, 00:05:56.067 "claimed": false, 00:05:56.067 "zoned": false, 00:05:56.067 "supported_io_types": { 00:05:56.067 "read": true, 00:05:56.067 "write": true, 00:05:56.067 "unmap": true, 00:05:56.067 "flush": true, 00:05:56.067 "reset": true, 00:05:56.067 "nvme_admin": false, 00:05:56.067 "nvme_io": false, 00:05:56.067 "nvme_io_md": false, 00:05:56.067 "write_zeroes": true, 00:05:56.067 "zcopy": true, 00:05:56.067 "get_zone_info": false, 00:05:56.067 "zone_management": false, 00:05:56.067 "zone_append": false, 00:05:56.067 "compare": false, 00:05:56.067 "compare_and_write": false, 00:05:56.067 "abort": true, 00:05:56.067 "seek_hole": false, 00:05:56.067 "seek_data": false, 00:05:56.067 "copy": true, 00:05:56.067 "nvme_iov_md": false 00:05:56.067 }, 00:05:56.067 "memory_domains": [ 00:05:56.067 { 00:05:56.067 "dma_device_id": "system", 00:05:56.067 "dma_device_type": 1 00:05:56.067 }, 00:05:56.067 { 00:05:56.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:56.067 "dma_device_type": 2 00:05:56.067 } 00:05:56.067 ], 00:05:56.067 "driver_specific": { 00:05:56.067 "passthru": { 00:05:56.067 "name": "Passthru0", 00:05:56.067 "base_bdev_name": "Malloc0" 00:05:56.067 } 00:05:56.067 } 00:05:56.067 } 00:05:56.067 ]' 00:05:56.067 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:56.067 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:56.067 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:56.067 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.067 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.067 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.067 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:56.067 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.067 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.067 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.067 15:43:03 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:56.067 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.067 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.067 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.068 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:56.068 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:56.068 15:43:03 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:56.068 00:05:56.068 real 0m0.293s 00:05:56.068 user 0m0.180s 00:05:56.068 sys 0m0.054s 00:05:56.068 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:56.068 15:43:03 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.068 ************************************ 00:05:56.068 END TEST rpc_integrity 00:05:56.068 ************************************ 00:05:56.068 15:43:03 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:56.068 15:43:03 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:56.068 15:43:03 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:56.068 15:43:03 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:56.068 ************************************ 00:05:56.068 START TEST rpc_plugins 00:05:56.068 ************************************ 00:05:56.068 15:43:03 rpc.rpc_plugins -- common/autotest_common.sh@1129 -- # rpc_plugins 00:05:56.068 15:43:03 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:56.068 15:43:03 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.068 15:43:03 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:56.068 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.068 15:43:04 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:56.068 15:43:04 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:56.068 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.068 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:56.068 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.068 15:43:04 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:56.068 { 00:05:56.068 "name": "Malloc1", 00:05:56.068 "aliases": [ 00:05:56.068 "1c76096a-8014-41c3-9094-c920660e0ebf" 00:05:56.068 ], 00:05:56.068 "product_name": "Malloc disk", 00:05:56.068 "block_size": 4096, 00:05:56.068 "num_blocks": 256, 00:05:56.068 "uuid": "1c76096a-8014-41c3-9094-c920660e0ebf", 00:05:56.068 "assigned_rate_limits": { 00:05:56.068 "rw_ios_per_sec": 0, 00:05:56.068 "rw_mbytes_per_sec": 0, 00:05:56.068 "r_mbytes_per_sec": 0, 00:05:56.068 "w_mbytes_per_sec": 0 00:05:56.068 }, 00:05:56.068 "claimed": false, 00:05:56.068 "zoned": false, 00:05:56.068 "supported_io_types": { 00:05:56.068 "read": true, 00:05:56.068 "write": true, 00:05:56.068 "unmap": true, 00:05:56.068 "flush": true, 00:05:56.068 "reset": true, 00:05:56.068 "nvme_admin": false, 00:05:56.068 "nvme_io": false, 00:05:56.068 "nvme_io_md": false, 00:05:56.068 "write_zeroes": true, 00:05:56.068 "zcopy": true, 00:05:56.068 "get_zone_info": false, 00:05:56.068 "zone_management": false, 00:05:56.068 "zone_append": false, 00:05:56.068 "compare": false, 00:05:56.068 "compare_and_write": false, 00:05:56.068 "abort": true, 00:05:56.068 "seek_hole": false, 00:05:56.068 "seek_data": false, 00:05:56.068 "copy": true, 00:05:56.068 
"nvme_iov_md": false 00:05:56.068 }, 00:05:56.068 "memory_domains": [ 00:05:56.068 { 00:05:56.068 "dma_device_id": "system", 00:05:56.068 "dma_device_type": 1 00:05:56.068 }, 00:05:56.068 { 00:05:56.068 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:56.068 "dma_device_type": 2 00:05:56.068 } 00:05:56.068 ], 00:05:56.068 "driver_specific": {} 00:05:56.068 } 00:05:56.068 ]' 00:05:56.328 15:43:04 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:56.328 15:43:04 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:56.328 15:43:04 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:56.328 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.328 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:56.328 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.328 15:43:04 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:56.328 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.328 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:56.328 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.328 15:43:04 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:56.328 15:43:04 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:56.328 15:43:04 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:56.328 00:05:56.328 real 0m0.154s 00:05:56.328 user 0m0.100s 00:05:56.328 sys 0m0.019s 00:05:56.328 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:56.328 15:43:04 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:56.328 ************************************ 00:05:56.328 END TEST rpc_plugins 00:05:56.328 ************************************ 00:05:56.328 15:43:04 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:56.328 15:43:04 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:56.328 15:43:04 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:56.328 15:43:04 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:56.328 ************************************ 00:05:56.328 START TEST rpc_trace_cmd_test 00:05:56.328 ************************************ 00:05:56.328 15:43:04 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1129 -- # rpc_trace_cmd_test 00:05:56.328 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:56.328 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:56.328 15:43:04 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.328 15:43:04 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:56.328 15:43:04 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.328 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:56.328 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1695383", 00:05:56.328 "tpoint_group_mask": "0x8", 00:05:56.328 "iscsi_conn": { 00:05:56.328 "mask": "0x2", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "scsi": { 00:05:56.328 "mask": "0x4", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "bdev": { 00:05:56.328 "mask": "0x8", 00:05:56.328 "tpoint_mask": "0xffffffffffffffff" 00:05:56.328 }, 00:05:56.328 "nvmf_rdma": { 00:05:56.328 "mask": "0x10", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "nvmf_tcp": { 00:05:56.328 "mask": "0x20", 
00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "ftl": { 00:05:56.328 "mask": "0x40", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "blobfs": { 00:05:56.328 "mask": "0x80", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "dsa": { 00:05:56.328 "mask": "0x200", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "thread": { 00:05:56.328 "mask": "0x400", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "nvme_pcie": { 00:05:56.328 "mask": "0x800", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "iaa": { 00:05:56.328 "mask": "0x1000", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "nvme_tcp": { 00:05:56.328 "mask": "0x2000", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "bdev_nvme": { 00:05:56.328 "mask": "0x4000", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "sock": { 00:05:56.328 "mask": "0x8000", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "blob": { 00:05:56.328 "mask": "0x10000", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "bdev_raid": { 00:05:56.328 "mask": "0x20000", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 }, 00:05:56.328 "scheduler": { 00:05:56.328 "mask": "0x40000", 00:05:56.328 "tpoint_mask": "0x0" 00:05:56.328 } 00:05:56.328 }' 00:05:56.328 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:56.328 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 19 -gt 2 ']' 00:05:56.328 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:56.587 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:56.587 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:56.587 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:56.587 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:56.587 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:56.587 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:56.587 15:43:04 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:56.587 00:05:56.587 real 0m0.227s 00:05:56.587 user 0m0.186s 00:05:56.587 sys 0m0.034s 00:05:56.587 15:43:04 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:56.587 15:43:04 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:56.587 ************************************ 00:05:56.587 END TEST rpc_trace_cmd_test 00:05:56.587 ************************************ 00:05:56.587 15:43:04 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:56.587 15:43:04 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:56.587 15:43:04 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:56.587 15:43:04 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:56.587 15:43:04 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:56.587 15:43:04 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:56.587 ************************************ 00:05:56.587 START TEST rpc_daemon_integrity 00:05:56.587 ************************************ 00:05:56.587 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:05:56.587 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:56.587 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.587 15:43:04 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.587 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.587 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:56.587 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:56.847 { 00:05:56.847 "name": "Malloc2", 00:05:56.847 "aliases": [ 00:05:56.847 "61ba6fba-13a6-46b2-a292-a4d7e763fa3b" 00:05:56.847 ], 00:05:56.847 "product_name": "Malloc disk", 00:05:56.847 "block_size": 512, 00:05:56.847 "num_blocks": 16384, 00:05:56.847 "uuid": "61ba6fba-13a6-46b2-a292-a4d7e763fa3b", 00:05:56.847 "assigned_rate_limits": { 00:05:56.847 "rw_ios_per_sec": 0, 00:05:56.847 "rw_mbytes_per_sec": 0, 00:05:56.847 "r_mbytes_per_sec": 0, 00:05:56.847 "w_mbytes_per_sec": 0 00:05:56.847 }, 00:05:56.847 "claimed": false, 00:05:56.847 "zoned": false, 00:05:56.847 "supported_io_types": { 00:05:56.847 "read": true, 00:05:56.847 "write": true, 00:05:56.847 "unmap": true, 00:05:56.847 "flush": true, 00:05:56.847 "reset": true, 00:05:56.847 "nvme_admin": false, 00:05:56.847 "nvme_io": false, 00:05:56.847 "nvme_io_md": false, 00:05:56.847 "write_zeroes": true, 00:05:56.847 "zcopy": true, 00:05:56.847 "get_zone_info": false, 00:05:56.847 "zone_management": false, 00:05:56.847 "zone_append": false, 00:05:56.847 "compare": false, 00:05:56.847 "compare_and_write": false, 00:05:56.847 "abort": true, 00:05:56.847 "seek_hole": false, 00:05:56.847 "seek_data": false, 00:05:56.847 "copy": true, 00:05:56.847 "nvme_iov_md": false 00:05:56.847 }, 00:05:56.847 "memory_domains": [ 00:05:56.847 { 00:05:56.847 "dma_device_id": "system", 00:05:56.847 "dma_device_type": 1 00:05:56.847 }, 00:05:56.847 { 00:05:56.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:56.847 "dma_device_type": 2 00:05:56.847 } 00:05:56.847 ], 00:05:56.847 "driver_specific": {} 00:05:56.847 } 00:05:56.847 ]' 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.847 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.847 [2024-11-30 15:43:04.660465] vbdev_passthru.c: 608:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:56.848 
[2024-11-30 15:43:04.660494] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:56.848 [2024-11-30 15:43:04.660516] vbdev_passthru.c: 682:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x623ff00 00:05:56.848 [2024-11-30 15:43:04.660525] vbdev_passthru.c: 697:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:56.848 [2024-11-30 15:43:04.661331] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:56.848 [2024-11-30 15:43:04.661353] vbdev_passthru.c: 711:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:56.848 Passthru0 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:56.848 { 00:05:56.848 "name": "Malloc2", 00:05:56.848 "aliases": [ 00:05:56.848 "61ba6fba-13a6-46b2-a292-a4d7e763fa3b" 00:05:56.848 ], 00:05:56.848 "product_name": "Malloc disk", 00:05:56.848 "block_size": 512, 00:05:56.848 "num_blocks": 16384, 00:05:56.848 "uuid": "61ba6fba-13a6-46b2-a292-a4d7e763fa3b", 00:05:56.848 "assigned_rate_limits": { 00:05:56.848 "rw_ios_per_sec": 0, 00:05:56.848 "rw_mbytes_per_sec": 0, 00:05:56.848 "r_mbytes_per_sec": 0, 00:05:56.848 "w_mbytes_per_sec": 0 00:05:56.848 }, 00:05:56.848 "claimed": true, 00:05:56.848 "claim_type": "exclusive_write", 00:05:56.848 "zoned": false, 00:05:56.848 "supported_io_types": { 00:05:56.848 "read": true, 00:05:56.848 "write": true, 00:05:56.848 "unmap": true, 00:05:56.848 "flush": true, 00:05:56.848 "reset": true, 00:05:56.848 "nvme_admin": false, 00:05:56.848 "nvme_io": false, 00:05:56.848 "nvme_io_md": false, 00:05:56.848 "write_zeroes": true, 00:05:56.848 "zcopy": true, 00:05:56.848 "get_zone_info": false, 00:05:56.848 "zone_management": false, 00:05:56.848 "zone_append": false, 00:05:56.848 "compare": false, 00:05:56.848 "compare_and_write": false, 00:05:56.848 "abort": true, 00:05:56.848 "seek_hole": false, 00:05:56.848 "seek_data": false, 00:05:56.848 "copy": true, 00:05:56.848 "nvme_iov_md": false 00:05:56.848 }, 00:05:56.848 "memory_domains": [ 00:05:56.848 { 00:05:56.848 "dma_device_id": "system", 00:05:56.848 "dma_device_type": 1 00:05:56.848 }, 00:05:56.848 { 00:05:56.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:56.848 "dma_device_type": 2 00:05:56.848 } 00:05:56.848 ], 00:05:56.848 "driver_specific": {} 00:05:56.848 }, 00:05:56.848 { 00:05:56.848 "name": "Passthru0", 00:05:56.848 "aliases": [ 00:05:56.848 "639c3b70-2aeb-5dd8-9060-e119e64effac" 00:05:56.848 ], 00:05:56.848 "product_name": "passthru", 00:05:56.848 "block_size": 512, 00:05:56.848 "num_blocks": 16384, 00:05:56.848 "uuid": "639c3b70-2aeb-5dd8-9060-e119e64effac", 00:05:56.848 "assigned_rate_limits": { 00:05:56.848 "rw_ios_per_sec": 0, 00:05:56.848 "rw_mbytes_per_sec": 0, 00:05:56.848 "r_mbytes_per_sec": 0, 00:05:56.848 "w_mbytes_per_sec": 0 00:05:56.848 }, 00:05:56.848 "claimed": false, 00:05:56.848 "zoned": false, 00:05:56.848 "supported_io_types": { 00:05:56.848 "read": true, 00:05:56.848 "write": true, 00:05:56.848 "unmap": true, 00:05:56.848 "flush": true, 00:05:56.848 "reset": true, 
00:05:56.848 "nvme_admin": false, 00:05:56.848 "nvme_io": false, 00:05:56.848 "nvme_io_md": false, 00:05:56.848 "write_zeroes": true, 00:05:56.848 "zcopy": true, 00:05:56.848 "get_zone_info": false, 00:05:56.848 "zone_management": false, 00:05:56.848 "zone_append": false, 00:05:56.848 "compare": false, 00:05:56.848 "compare_and_write": false, 00:05:56.848 "abort": true, 00:05:56.848 "seek_hole": false, 00:05:56.848 "seek_data": false, 00:05:56.848 "copy": true, 00:05:56.848 "nvme_iov_md": false 00:05:56.848 }, 00:05:56.848 "memory_domains": [ 00:05:56.848 { 00:05:56.848 "dma_device_id": "system", 00:05:56.848 "dma_device_type": 1 00:05:56.848 }, 00:05:56.848 { 00:05:56.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:56.848 "dma_device_type": 2 00:05:56.848 } 00:05:56.848 ], 00:05:56.848 "driver_specific": { 00:05:56.848 "passthru": { 00:05:56.848 "name": "Passthru0", 00:05:56.848 "base_bdev_name": "Malloc2" 00:05:56.848 } 00:05:56.848 } 00:05:56.848 } 00:05:56.848 ]' 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:56.848 00:05:56.848 real 0m0.282s 00:05:56.848 user 0m0.170s 00:05:56.848 sys 0m0.048s 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:56.848 15:43:04 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:56.848 ************************************ 00:05:56.848 END TEST rpc_daemon_integrity 00:05:56.848 ************************************ 00:05:57.108 15:43:04 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:57.108 15:43:04 rpc -- rpc/rpc.sh@84 -- # killprocess 1695383 00:05:57.108 15:43:04 rpc -- common/autotest_common.sh@954 -- # '[' -z 1695383 ']' 00:05:57.108 15:43:04 rpc -- common/autotest_common.sh@958 -- # kill -0 1695383 00:05:57.108 15:43:04 rpc -- common/autotest_common.sh@959 -- # uname 00:05:57.108 15:43:04 rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:57.108 15:43:04 rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1695383 
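[editor's note] Both integrity suites (rpc_integrity on Malloc0 earlier, rpc_daemon_integrity on Malloc2 just above) drive the same bdev RPCs through the rpc_cmd wrapper. A minimal sketch of the same sequence replayed by hand with scripts/rpc.py, assuming a spdk_tgt listening on the default /var/tmp/spdk.sock; the RPC names and arguments are taken verbatim from the xtrace above:
  ./scripts/rpc.py bdev_malloc_create 8 512                 # 8 MB bdev, 512 B blocks -> prints the new name, e.g. Malloc0
  ./scripts/rpc.py bdev_passthru_create -b Malloc0 -p Passthru0
  ./scripts/rpc.py bdev_get_bdevs | jq length               # expect 2 while Passthru0 claims Malloc0
  ./scripts/rpc.py bdev_passthru_delete Passthru0
  ./scripts/rpc.py bdev_malloc_delete Malloc0
  ./scripts/rpc.py bdev_get_bdevs | jq length               # back to 0, as the '[' 0 == 0 ']' checks verify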
00:05:57.108 15:43:04 rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:57.108 15:43:04 rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:57.108 15:43:04 rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1695383' 00:05:57.108 killing process with pid 1695383 00:05:57.108 15:43:04 rpc -- common/autotest_common.sh@973 -- # kill 1695383 00:05:57.108 15:43:04 rpc -- common/autotest_common.sh@978 -- # wait 1695383 00:05:57.375 00:05:57.375 real 0m2.649s 00:05:57.375 user 0m3.237s 00:05:57.375 sys 0m0.846s 00:05:57.375 15:43:05 rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:57.375 15:43:05 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:57.375 ************************************ 00:05:57.375 END TEST rpc 00:05:57.375 ************************************ 00:05:57.375 15:43:05 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:57.375 15:43:05 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:57.375 15:43:05 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:57.375 15:43:05 -- common/autotest_common.sh@10 -- # set +x 00:05:57.375 ************************************ 00:05:57.375 START TEST skip_rpc 00:05:57.375 ************************************ 00:05:57.375 15:43:05 skip_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:57.636 * Looking for test storage... 00:05:57.636 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:57.636 15:43:05 skip_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:57.636 15:43:05 skip_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:05:57.636 15:43:05 skip_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:57.636 15:43:05 skip_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@345 -- # : 1 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:57.636 15:43:05 skip_rpc -- scripts/common.sh@368 -- # return 0 00:05:57.636 15:43:05 skip_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:57.636 15:43:05 skip_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:57.636 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:57.636 --rc genhtml_branch_coverage=1 00:05:57.636 --rc genhtml_function_coverage=1 00:05:57.636 --rc genhtml_legend=1 00:05:57.636 --rc geninfo_all_blocks=1 00:05:57.636 --rc geninfo_unexecuted_blocks=1 00:05:57.636 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:57.636 ' 00:05:57.636 15:43:05 skip_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:57.636 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:57.636 --rc genhtml_branch_coverage=1 00:05:57.636 --rc genhtml_function_coverage=1 00:05:57.636 --rc genhtml_legend=1 00:05:57.636 --rc geninfo_all_blocks=1 00:05:57.636 --rc geninfo_unexecuted_blocks=1 00:05:57.636 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:57.636 ' 00:05:57.636 15:43:05 skip_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:57.636 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:57.636 --rc genhtml_branch_coverage=1 00:05:57.636 --rc genhtml_function_coverage=1 00:05:57.636 --rc genhtml_legend=1 00:05:57.636 --rc geninfo_all_blocks=1 00:05:57.636 --rc geninfo_unexecuted_blocks=1 00:05:57.636 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:57.636 ' 00:05:57.636 15:43:05 skip_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:57.636 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:57.636 --rc genhtml_branch_coverage=1 00:05:57.636 --rc genhtml_function_coverage=1 00:05:57.636 --rc genhtml_legend=1 00:05:57.636 --rc geninfo_all_blocks=1 00:05:57.636 --rc geninfo_unexecuted_blocks=1 00:05:57.636 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:57.636 ' 00:05:57.636 15:43:05 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:05:57.636 15:43:05 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:05:57.636 15:43:05 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:57.636 15:43:05 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:57.636 15:43:05 
skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:57.636 15:43:05 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:57.636 ************************************ 00:05:57.636 START TEST skip_rpc 00:05:57.636 ************************************ 00:05:57.636 15:43:05 skip_rpc.skip_rpc -- common/autotest_common.sh@1129 -- # test_skip_rpc 00:05:57.636 15:43:05 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1696089 00:05:57.636 15:43:05 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:57.636 15:43:05 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:57.636 15:43:05 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:57.636 [2024-11-30 15:43:05.526009] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:05:57.636 [2024-11-30 15:43:05.526084] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1696089 ] 00:05:57.896 [2024-11-30 15:43:05.662254] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:05:57.896 [2024-11-30 15:43:05.696391] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.896 [2024-11-30 15:43:05.720829] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # local es=0 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # rpc_cmd spdk_get_version 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:06:03.166 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # es=1 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1696089 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' -z 1696089 ']' 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # kill -0 1696089 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # uname 00:06:03.167 15:43:10 
skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1696089 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1696089' 00:06:03.167 killing process with pid 1696089 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # kill 1696089 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@978 -- # wait 1696089 00:06:03.167 00:06:03.167 real 0m5.357s 00:06:03.167 user 0m5.019s 00:06:03.167 sys 0m0.284s 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:03.167 15:43:10 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.167 ************************************ 00:06:03.167 END TEST skip_rpc 00:06:03.167 ************************************ 00:06:03.167 15:43:10 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:03.167 15:43:10 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:03.167 15:43:10 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:03.167 15:43:10 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.167 ************************************ 00:06:03.167 START TEST skip_rpc_with_json 00:06:03.167 ************************************ 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_json 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1697038 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1697038 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # '[' -z 1697038 ']' 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:03.167 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:03.167 15:43:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:03.167 [2024-11-30 15:43:10.966777] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
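[editor's note] The failed "NOT rpc_cmd spdk_get_version" exchange above is the whole point of test_skip_rpc: started with --no-rpc-server, the target runs but never opens its RPC socket, so every RPC must fail (es=1). A minimal sketch of that check by hand, assuming the default socket path; the flags and the 5 s wait mirror the harness lines above:
  sudo ./build/bin/spdk_tgt --no-rpc-server -m 0x1 &
  sleep 5
  ./scripts/rpc.py spdk_get_version \
      && echo 'unexpected: RPC server answered' \
      || echo 'RPC unavailable, as expected'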
00:06:03.167 [2024-11-30 15:43:10.966858] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1697038 ] 00:06:03.167 [2024-11-30 15:43:11.103196] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:03.426 [2024-11-30 15:43:11.138961] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.426 [2024-11-30 15:43:11.161117] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.993 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@868 -- # return 0 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:03.994 [2024-11-30 15:43:11.814746] nvmf_rpc.c:2706:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:03.994 request: 00:06:03.994 { 00:06:03.994 "trtype": "tcp", 00:06:03.994 "method": "nvmf_get_transports", 00:06:03.994 "req_id": 1 00:06:03.994 } 00:06:03.994 Got JSON-RPC error response 00:06:03.994 response: 00:06:03.994 { 00:06:03.994 "code": -19, 00:06:03.994 "message": "No such device" 00:06:03.994 } 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:03.994 [2024-11-30 15:43:11.822815] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:03.994 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:04.253 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:04.253 15:43:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:04.253 { 00:06:04.253 "subsystems": [ 00:06:04.253 { 00:06:04.253 "subsystem": "scheduler", 00:06:04.253 "config": [ 00:06:04.253 { 00:06:04.253 "method": "framework_set_scheduler", 00:06:04.253 "params": { 00:06:04.253 "name": "static" 00:06:04.253 } 00:06:04.253 } 00:06:04.253 ] 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "subsystem": "vmd", 00:06:04.253 "config": [] 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "subsystem": "sock", 00:06:04.253 "config": [ 00:06:04.253 { 00:06:04.253 "method": "sock_set_default_impl", 00:06:04.253 "params": { 00:06:04.253 "impl_name": "posix" 00:06:04.253 } 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "method": "sock_impl_set_options", 00:06:04.253 "params": { 00:06:04.253 "impl_name": 
"ssl", 00:06:04.253 "recv_buf_size": 4096, 00:06:04.253 "send_buf_size": 4096, 00:06:04.253 "enable_recv_pipe": true, 00:06:04.253 "enable_quickack": false, 00:06:04.253 "enable_placement_id": 0, 00:06:04.253 "enable_zerocopy_send_server": true, 00:06:04.253 "enable_zerocopy_send_client": false, 00:06:04.253 "zerocopy_threshold": 0, 00:06:04.253 "tls_version": 0, 00:06:04.253 "enable_ktls": false 00:06:04.253 } 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "method": "sock_impl_set_options", 00:06:04.253 "params": { 00:06:04.253 "impl_name": "posix", 00:06:04.253 "recv_buf_size": 2097152, 00:06:04.253 "send_buf_size": 2097152, 00:06:04.253 "enable_recv_pipe": true, 00:06:04.253 "enable_quickack": false, 00:06:04.253 "enable_placement_id": 0, 00:06:04.253 "enable_zerocopy_send_server": true, 00:06:04.253 "enable_zerocopy_send_client": false, 00:06:04.253 "zerocopy_threshold": 0, 00:06:04.253 "tls_version": 0, 00:06:04.253 "enable_ktls": false 00:06:04.253 } 00:06:04.253 } 00:06:04.253 ] 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "subsystem": "iobuf", 00:06:04.253 "config": [ 00:06:04.253 { 00:06:04.253 "method": "iobuf_set_options", 00:06:04.253 "params": { 00:06:04.253 "small_pool_count": 8192, 00:06:04.253 "large_pool_count": 1024, 00:06:04.253 "small_bufsize": 8192, 00:06:04.253 "large_bufsize": 135168, 00:06:04.253 "enable_numa": false 00:06:04.253 } 00:06:04.253 } 00:06:04.253 ] 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "subsystem": "keyring", 00:06:04.253 "config": [] 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "subsystem": "vfio_user_target", 00:06:04.253 "config": null 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "subsystem": "fsdev", 00:06:04.253 "config": [ 00:06:04.253 { 00:06:04.253 "method": "fsdev_set_opts", 00:06:04.253 "params": { 00:06:04.253 "fsdev_io_pool_size": 65535, 00:06:04.253 "fsdev_io_cache_size": 256 00:06:04.253 } 00:06:04.253 } 00:06:04.253 ] 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "subsystem": "accel", 00:06:04.253 "config": [ 00:06:04.253 { 00:06:04.253 "method": "accel_set_options", 00:06:04.253 "params": { 00:06:04.253 "small_cache_size": 128, 00:06:04.253 "large_cache_size": 16, 00:06:04.253 "task_count": 2048, 00:06:04.253 "sequence_count": 2048, 00:06:04.253 "buf_count": 2048 00:06:04.253 } 00:06:04.253 } 00:06:04.253 ] 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "subsystem": "bdev", 00:06:04.253 "config": [ 00:06:04.253 { 00:06:04.253 "method": "bdev_set_options", 00:06:04.253 "params": { 00:06:04.253 "bdev_io_pool_size": 65535, 00:06:04.253 "bdev_io_cache_size": 256, 00:06:04.253 "bdev_auto_examine": true, 00:06:04.253 "iobuf_small_cache_size": 128, 00:06:04.253 "iobuf_large_cache_size": 16 00:06:04.253 } 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "method": "bdev_raid_set_options", 00:06:04.253 "params": { 00:06:04.253 "process_window_size_kb": 1024, 00:06:04.253 "process_max_bandwidth_mb_sec": 0 00:06:04.253 } 00:06:04.253 }, 00:06:04.253 { 00:06:04.253 "method": "bdev_nvme_set_options", 00:06:04.253 "params": { 00:06:04.253 "action_on_timeout": "none", 00:06:04.253 "timeout_us": 0, 00:06:04.253 "timeout_admin_us": 0, 00:06:04.253 "keep_alive_timeout_ms": 10000, 00:06:04.253 "arbitration_burst": 0, 00:06:04.253 "low_priority_weight": 0, 00:06:04.253 "medium_priority_weight": 0, 00:06:04.253 "high_priority_weight": 0, 00:06:04.253 "nvme_adminq_poll_period_us": 10000, 00:06:04.253 "nvme_ioq_poll_period_us": 0, 00:06:04.253 "io_queue_requests": 0, 00:06:04.253 "delay_cmd_submit": true, 00:06:04.253 "transport_retry_count": 4, 00:06:04.253 
"bdev_retry_count": 3, 00:06:04.253 "transport_ack_timeout": 0, 00:06:04.253 "ctrlr_loss_timeout_sec": 0, 00:06:04.253 "reconnect_delay_sec": 0, 00:06:04.253 "fast_io_fail_timeout_sec": 0, 00:06:04.253 "disable_auto_failback": false, 00:06:04.253 "generate_uuids": false, 00:06:04.253 "transport_tos": 0, 00:06:04.253 "nvme_error_stat": false, 00:06:04.253 "rdma_srq_size": 0, 00:06:04.253 "io_path_stat": false, 00:06:04.253 "allow_accel_sequence": false, 00:06:04.253 "rdma_max_cq_size": 0, 00:06:04.253 "rdma_cm_event_timeout_ms": 0, 00:06:04.253 "dhchap_digests": [ 00:06:04.253 "sha256", 00:06:04.253 "sha384", 00:06:04.254 "sha512" 00:06:04.254 ], 00:06:04.254 "dhchap_dhgroups": [ 00:06:04.254 "null", 00:06:04.254 "ffdhe2048", 00:06:04.254 "ffdhe3072", 00:06:04.254 "ffdhe4096", 00:06:04.254 "ffdhe6144", 00:06:04.254 "ffdhe8192" 00:06:04.254 ] 00:06:04.254 } 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "method": "bdev_nvme_set_hotplug", 00:06:04.254 "params": { 00:06:04.254 "period_us": 100000, 00:06:04.254 "enable": false 00:06:04.254 } 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "method": "bdev_iscsi_set_options", 00:06:04.254 "params": { 00:06:04.254 "timeout_sec": 30 00:06:04.254 } 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "method": "bdev_wait_for_examine" 00:06:04.254 } 00:06:04.254 ] 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "subsystem": "nvmf", 00:06:04.254 "config": [ 00:06:04.254 { 00:06:04.254 "method": "nvmf_set_config", 00:06:04.254 "params": { 00:06:04.254 "discovery_filter": "match_any", 00:06:04.254 "admin_cmd_passthru": { 00:06:04.254 "identify_ctrlr": false 00:06:04.254 }, 00:06:04.254 "dhchap_digests": [ 00:06:04.254 "sha256", 00:06:04.254 "sha384", 00:06:04.254 "sha512" 00:06:04.254 ], 00:06:04.254 "dhchap_dhgroups": [ 00:06:04.254 "null", 00:06:04.254 "ffdhe2048", 00:06:04.254 "ffdhe3072", 00:06:04.254 "ffdhe4096", 00:06:04.254 "ffdhe6144", 00:06:04.254 "ffdhe8192" 00:06:04.254 ] 00:06:04.254 } 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "method": "nvmf_set_max_subsystems", 00:06:04.254 "params": { 00:06:04.254 "max_subsystems": 1024 00:06:04.254 } 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "method": "nvmf_set_crdt", 00:06:04.254 "params": { 00:06:04.254 "crdt1": 0, 00:06:04.254 "crdt2": 0, 00:06:04.254 "crdt3": 0 00:06:04.254 } 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "method": "nvmf_create_transport", 00:06:04.254 "params": { 00:06:04.254 "trtype": "TCP", 00:06:04.254 "max_queue_depth": 128, 00:06:04.254 "max_io_qpairs_per_ctrlr": 127, 00:06:04.254 "in_capsule_data_size": 4096, 00:06:04.254 "max_io_size": 131072, 00:06:04.254 "io_unit_size": 131072, 00:06:04.254 "max_aq_depth": 128, 00:06:04.254 "num_shared_buffers": 511, 00:06:04.254 "buf_cache_size": 4294967295, 00:06:04.254 "dif_insert_or_strip": false, 00:06:04.254 "zcopy": false, 00:06:04.254 "c2h_success": true, 00:06:04.254 "sock_priority": 0, 00:06:04.254 "abort_timeout_sec": 1, 00:06:04.254 "ack_timeout": 0, 00:06:04.254 "data_wr_pool_size": 0 00:06:04.254 } 00:06:04.254 } 00:06:04.254 ] 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "subsystem": "nbd", 00:06:04.254 "config": [] 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "subsystem": "ublk", 00:06:04.254 "config": [] 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "subsystem": "vhost_blk", 00:06:04.254 "config": [] 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "subsystem": "scsi", 00:06:04.254 "config": null 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "subsystem": "iscsi", 00:06:04.254 "config": [ 00:06:04.254 { 00:06:04.254 "method": "iscsi_set_options", 
00:06:04.254 "params": { 00:06:04.254 "node_base": "iqn.2016-06.io.spdk", 00:06:04.254 "max_sessions": 128, 00:06:04.254 "max_connections_per_session": 2, 00:06:04.254 "max_queue_depth": 64, 00:06:04.254 "default_time2wait": 2, 00:06:04.254 "default_time2retain": 20, 00:06:04.254 "first_burst_length": 8192, 00:06:04.254 "immediate_data": true, 00:06:04.254 "allow_duplicated_isid": false, 00:06:04.254 "error_recovery_level": 0, 00:06:04.254 "nop_timeout": 60, 00:06:04.254 "nop_in_interval": 30, 00:06:04.254 "disable_chap": false, 00:06:04.254 "require_chap": false, 00:06:04.254 "mutual_chap": false, 00:06:04.254 "chap_group": 0, 00:06:04.254 "max_large_datain_per_connection": 64, 00:06:04.254 "max_r2t_per_connection": 4, 00:06:04.254 "pdu_pool_size": 36864, 00:06:04.254 "immediate_data_pool_size": 16384, 00:06:04.254 "data_out_pool_size": 2048 00:06:04.254 } 00:06:04.254 } 00:06:04.254 ] 00:06:04.254 }, 00:06:04.254 { 00:06:04.254 "subsystem": "vhost_scsi", 00:06:04.254 "config": [] 00:06:04.254 } 00:06:04.254 ] 00:06:04.254 } 00:06:04.254 15:43:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:04.254 15:43:11 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1697038 00:06:04.254 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 1697038 ']' 00:06:04.254 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 1697038 00:06:04.254 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:06:04.254 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:04.254 15:43:11 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1697038 00:06:04.254 15:43:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:04.254 15:43:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:04.254 15:43:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1697038' 00:06:04.254 killing process with pid 1697038 00:06:04.254 15:43:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 1697038 00:06:04.254 15:43:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 1697038 00:06:04.514 15:43:12 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1697234 00:06:04.514 15:43:12 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:04.514 15:43:12 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1697234 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 1697234 ']' 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 1697234 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1697234 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # 
process_name=reactor_0 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1697234' 00:06:09.780 killing process with pid 1697234 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 1697234 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 1697234 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:09.780 00:06:09.780 real 0m6.756s 00:06:09.780 user 0m6.371s 00:06:09.780 sys 0m0.661s 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:09.780 15:43:17 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:09.780 ************************************ 00:06:09.780 END TEST skip_rpc_with_json 00:06:09.780 ************************************ 00:06:09.780 15:43:17 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:09.780 15:43:17 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:09.780 15:43:17 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:09.780 15:43:17 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:10.039 ************************************ 00:06:10.039 START TEST skip_rpc_with_delay 00:06:10.039 ************************************ 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_delay 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # local es=0 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # [[ -x 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:10.039 [2024-11-30 15:43:17.808646] app.c: 842:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # es=1 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:10.039 00:06:10.039 real 0m0.046s 00:06:10.039 user 0m0.022s 00:06:10.039 sys 0m0.024s 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:10.039 15:43:17 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:10.039 ************************************ 00:06:10.039 END TEST skip_rpc_with_delay 00:06:10.039 ************************************ 00:06:10.040 15:43:17 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:10.040 15:43:17 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:10.040 15:43:17 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:10.040 15:43:17 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:10.040 15:43:17 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:10.040 15:43:17 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:10.040 ************************************ 00:06:10.040 START TEST exit_on_failed_rpc_init 00:06:10.040 ************************************ 00:06:10.040 15:43:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1129 -- # test_exit_on_failed_rpc_init 00:06:10.040 15:43:17 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1698310 00:06:10.040 15:43:17 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1698310 00:06:10.040 15:43:17 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:10.040 15:43:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # '[' -z 1698310 ']' 00:06:10.040 15:43:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:10.040 15:43:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:10.040 15:43:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:10.040 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:10.040 15:43:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:10.040 15:43:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:10.040 [2024-11-30 15:43:17.939524] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
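The NOT/valid_exec_arg trace above asserts that the wrapped spdk_tgt invocation must fail; a minimal, hypothetical analogue of that helper (the real one in autotest_common.sh also inspects and remaps the exit status, as the es= lines in this trace show, before deciding) is:

    SPDK_BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
    NOT() { "$@" && return 1 || return 0; }   # succeed only when the wrapped command fails
    NOT "$SPDK_BIN" --no-rpc-server -m 0x1 --wait-for-rpc && echo 'failed as required'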
00:06:10.040 [2024-11-30 15:43:17.939588] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1698310 ] 00:06:10.299 [2024-11-30 15:43:18.076357] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:10.299 [2024-11-30 15:43:18.112067] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.299 [2024-11-30 15:43:18.132893] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@868 -- # return 0 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # local es=0 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:10.867 15:43:18 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:10.867 [2024-11-30 15:43:18.829064] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:10.867 [2024-11-30 15:43:18.829128] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1698486 ] 00:06:11.125 [2024-11-30 15:43:18.964434] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
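At this point a second spdk_tgt (file-prefix spdk_pid1698486) is coming up against the same default RPC socket as pid 1698310; a minimal sketch of the scenario this test drives (binary path taken from the trace, the sleep is a crude stand-in for waitforlisten) is:

    SPDK_BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
    "$SPDK_BIN" -m 0x1 &            # first target binds the default /var/tmp/spdk.sock
    first=$!
    sleep 1
    if ! "$SPDK_BIN" -m 0x2; then   # second target exits non-zero: RPC socket already in use
        echo 'rpc init failed as expected'
    fi
    kill -SIGINT "$first" && wait "$first"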
00:06:11.125 [2024-11-30 15:43:18.996393] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.125 [2024-11-30 15:43:19.018866] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:11.125 [2024-11-30 15:43:19.018941] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 00:06:11.125 [2024-11-30 15:43:19.018954] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:11.125 [2024-11-30 15:43:19.018962] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:11.125 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # es=234 00:06:11.125 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:11.125 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@664 -- # es=106 00:06:11.125 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@665 -- # case "$es" in 00:06:11.126 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@672 -- # es=1 00:06:11.126 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:11.126 15:43:19 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:11.126 15:43:19 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1698310 00:06:11.126 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' -z 1698310 ']' 00:06:11.126 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # kill -0 1698310 00:06:11.126 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # uname 00:06:11.126 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:11.126 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1698310 00:06:11.384 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:11.384 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:11.384 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1698310' 00:06:11.384 killing process with pid 1698310 00:06:11.384 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # kill 1698310 00:06:11.384 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@978 -- # wait 1698310 00:06:11.643 00:06:11.643 real 0m1.490s 00:06:11.643 user 0m1.541s 00:06:11.643 sys 0m0.433s 00:06:11.643 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:11.643 15:43:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:11.643 ************************************ 00:06:11.643 END TEST exit_on_failed_rpc_init 00:06:11.643 ************************************ 00:06:11.643 15:43:19 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:11.643 00:06:11.643 real 0m14.171s 00:06:11.643 user 0m13.179s 00:06:11.643 sys 0m1.739s 00:06:11.643 15:43:19 skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:11.643 15:43:19 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:11.643 ************************************ 00:06:11.643 END TEST skip_rpc 00:06:11.643 
************************************ 00:06:11.643 15:43:19 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:11.643 15:43:19 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:11.643 15:43:19 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:11.643 15:43:19 -- common/autotest_common.sh@10 -- # set +x 00:06:11.643 ************************************ 00:06:11.643 START TEST rpc_client 00:06:11.643 ************************************ 00:06:11.643 15:43:19 rpc_client -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:11.902 * Looking for test storage... 00:06:11.902 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:11.902 15:43:19 rpc_client -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:11.902 15:43:19 rpc_client -- common/autotest_common.sh@1693 -- # lcov --version 00:06:11.902 15:43:19 rpc_client -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:11.902 15:43:19 rpc_client -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@345 -- # : 1 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@353 -- # local d=1 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@355 -- # echo 1 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@353 -- # local d=2 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@355 -- # echo 2 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:11.902 15:43:19 rpc_client -- scripts/common.sh@368 -- # return 0 00:06:11.902 15:43:19 rpc_client -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:11.902 15:43:19 rpc_client -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:11.902 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:11.902 --rc genhtml_branch_coverage=1 00:06:11.902 --rc genhtml_function_coverage=1 00:06:11.902 --rc genhtml_legend=1 00:06:11.902 --rc geninfo_all_blocks=1 00:06:11.902 --rc geninfo_unexecuted_blocks=1 00:06:11.902 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:11.902 ' 00:06:11.902 15:43:19 rpc_client -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:11.902 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:11.902 --rc genhtml_branch_coverage=1 00:06:11.902 --rc genhtml_function_coverage=1 00:06:11.902 --rc genhtml_legend=1 00:06:11.902 --rc geninfo_all_blocks=1 00:06:11.902 --rc geninfo_unexecuted_blocks=1 00:06:11.902 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:11.902 ' 00:06:11.902 15:43:19 rpc_client -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:11.902 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:11.902 --rc genhtml_branch_coverage=1 00:06:11.902 --rc genhtml_function_coverage=1 00:06:11.902 --rc genhtml_legend=1 00:06:11.902 --rc geninfo_all_blocks=1 00:06:11.902 --rc geninfo_unexecuted_blocks=1 00:06:11.902 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:11.902 ' 00:06:11.902 15:43:19 rpc_client -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:11.902 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:11.902 --rc genhtml_branch_coverage=1 00:06:11.902 --rc genhtml_function_coverage=1 00:06:11.902 --rc genhtml_legend=1 00:06:11.902 --rc geninfo_all_blocks=1 00:06:11.902 --rc geninfo_unexecuted_blocks=1 00:06:11.902 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:11.902 ' 00:06:11.902 15:43:19 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:11.902 OK 00:06:11.902 15:43:19 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:11.902 00:06:11.902 real 0m0.216s 00:06:11.902 user 0m0.109s 00:06:11.902 sys 0m0.124s 00:06:11.902 15:43:19 rpc_client -- common/autotest_common.sh@1130 -- # xtrace_disable 
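The scripts/common.sh trace above is the lcov version gate (lt 1.15 2) that selects the --rc lcov_* coverage options; an equivalent check, not the script's own per-field cmp_versions loop, could look like:

    # hypothetical sort -V based equivalent of the lt()/cmp_versions trace above
    ver_lt() { [ "$1" != "$2" ] && [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]; }
    if ver_lt "$(lcov --version | awk '{print $NF}')" 2; then
        export LCOV_OPTS='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
    fi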
00:06:11.902 15:43:19 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:11.902 ************************************ 00:06:11.902 END TEST rpc_client 00:06:11.902 ************************************ 00:06:11.902 15:43:19 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:11.902 15:43:19 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:11.902 15:43:19 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:11.902 15:43:19 -- common/autotest_common.sh@10 -- # set +x 00:06:11.902 ************************************ 00:06:11.902 START TEST json_config 00:06:11.902 ************************************ 00:06:11.902 15:43:19 json_config -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:12.162 15:43:19 json_config -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:12.162 15:43:19 json_config -- common/autotest_common.sh@1693 -- # lcov --version 00:06:12.162 15:43:19 json_config -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:12.162 15:43:19 json_config -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:12.162 15:43:19 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:12.162 15:43:19 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:12.162 15:43:19 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:12.162 15:43:19 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:06:12.162 15:43:19 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:06:12.162 15:43:19 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:06:12.162 15:43:19 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:06:12.162 15:43:19 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:06:12.162 15:43:19 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:06:12.162 15:43:19 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:06:12.162 15:43:19 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:12.162 15:43:19 json_config -- scripts/common.sh@344 -- # case "$op" in 00:06:12.162 15:43:19 json_config -- scripts/common.sh@345 -- # : 1 00:06:12.162 15:43:19 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:12.162 15:43:19 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:12.162 15:43:19 json_config -- scripts/common.sh@365 -- # decimal 1 00:06:12.162 15:43:19 json_config -- scripts/common.sh@353 -- # local d=1 00:06:12.162 15:43:19 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:12.162 15:43:19 json_config -- scripts/common.sh@355 -- # echo 1 00:06:12.162 15:43:19 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:06:12.162 15:43:19 json_config -- scripts/common.sh@366 -- # decimal 2 00:06:12.162 15:43:19 json_config -- scripts/common.sh@353 -- # local d=2 00:06:12.162 15:43:19 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:12.162 15:43:19 json_config -- scripts/common.sh@355 -- # echo 2 00:06:12.162 15:43:19 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:06:12.162 15:43:19 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:12.163 15:43:19 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:12.163 15:43:19 json_config -- scripts/common.sh@368 -- # return 0 00:06:12.163 15:43:19 json_config -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:12.163 15:43:19 json_config -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:12.163 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.163 --rc genhtml_branch_coverage=1 00:06:12.163 --rc genhtml_function_coverage=1 00:06:12.163 --rc genhtml_legend=1 00:06:12.163 --rc geninfo_all_blocks=1 00:06:12.163 --rc geninfo_unexecuted_blocks=1 00:06:12.163 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.163 ' 00:06:12.163 15:43:19 json_config -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:12.163 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.163 --rc genhtml_branch_coverage=1 00:06:12.163 --rc genhtml_function_coverage=1 00:06:12.163 --rc genhtml_legend=1 00:06:12.163 --rc geninfo_all_blocks=1 00:06:12.163 --rc geninfo_unexecuted_blocks=1 00:06:12.163 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.163 ' 00:06:12.163 15:43:19 json_config -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:12.163 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.163 --rc genhtml_branch_coverage=1 00:06:12.163 --rc genhtml_function_coverage=1 00:06:12.163 --rc genhtml_legend=1 00:06:12.163 --rc geninfo_all_blocks=1 00:06:12.163 --rc geninfo_unexecuted_blocks=1 00:06:12.163 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.163 ' 00:06:12.163 15:43:19 json_config -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:12.163 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.163 --rc genhtml_branch_coverage=1 00:06:12.163 --rc genhtml_function_coverage=1 00:06:12.163 --rc genhtml_legend=1 00:06:12.163 --rc geninfo_all_blocks=1 00:06:12.163 --rc geninfo_unexecuted_blocks=1 00:06:12.163 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.163 ' 00:06:12.163 15:43:19 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:12.163 15:43:19 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:12.163 15:43:19 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:12.163 15:43:19 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:12.163 15:43:19 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:06:12.163 15:43:19 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:12.163 15:43:19 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:12.163 15:43:19 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:12.163 15:43:19 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:12.163 15:43:19 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:12.163 15:43:19 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:12.163 15:43:19 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:12.163 15:43:20 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:06:12.163 15:43:20 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:12.163 15:43:20 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:12.163 15:43:20 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:12.163 15:43:20 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.163 15:43:20 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.163 15:43:20 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.163 15:43:20 json_config -- paths/export.sh@5 -- # export PATH 00:06:12.163 15:43:20 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@51 -- # : 0 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:12.163 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:12.163 15:43:20 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:12.163 15:43:20 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:12.163 15:43:20 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:12.163 15:43:20 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:12.163 15:43:20 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:12.163 15:43:20 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:12.163 15:43:20 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:12.163 WARNING: No tests are enabled so not running JSON configuration tests 00:06:12.163 15:43:20 json_config -- json_config/json_config.sh@28 -- # exit 0 00:06:12.163 00:06:12.163 real 0m0.202s 00:06:12.163 user 0m0.126s 00:06:12.163 sys 0m0.086s 00:06:12.163 15:43:20 json_config -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:12.163 15:43:20 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:12.163 ************************************ 00:06:12.163 END TEST json_config 00:06:12.163 ************************************ 00:06:12.163 15:43:20 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:12.163 15:43:20 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:12.163 15:43:20 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:12.163 15:43:20 -- common/autotest_common.sh@10 -- # set +x 00:06:12.163 ************************************ 00:06:12.163 START TEST json_config_extra_key 00:06:12.163 ************************************ 00:06:12.163 15:43:20 json_config_extra_key -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:12.423 15:43:20 json_config_extra_key -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:12.423 15:43:20 json_config_extra_key -- common/autotest_common.sh@1693 -- # lcov 
--version 00:06:12.423 15:43:20 json_config_extra_key -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:12.423 15:43:20 json_config_extra_key -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:06:12.423 15:43:20 json_config_extra_key -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:12.423 15:43:20 json_config_extra_key -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:12.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.423 --rc genhtml_branch_coverage=1 00:06:12.423 --rc genhtml_function_coverage=1 00:06:12.423 --rc genhtml_legend=1 00:06:12.423 --rc geninfo_all_blocks=1 00:06:12.423 --rc geninfo_unexecuted_blocks=1 00:06:12.423 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.423 ' 00:06:12.423 15:43:20 json_config_extra_key -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:12.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.423 --rc genhtml_branch_coverage=1 
00:06:12.423 --rc genhtml_function_coverage=1 00:06:12.423 --rc genhtml_legend=1 00:06:12.423 --rc geninfo_all_blocks=1 00:06:12.423 --rc geninfo_unexecuted_blocks=1 00:06:12.423 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.423 ' 00:06:12.423 15:43:20 json_config_extra_key -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:12.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.423 --rc genhtml_branch_coverage=1 00:06:12.423 --rc genhtml_function_coverage=1 00:06:12.423 --rc genhtml_legend=1 00:06:12.423 --rc geninfo_all_blocks=1 00:06:12.423 --rc geninfo_unexecuted_blocks=1 00:06:12.423 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.423 ' 00:06:12.423 15:43:20 json_config_extra_key -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:12.423 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.423 --rc genhtml_branch_coverage=1 00:06:12.423 --rc genhtml_function_coverage=1 00:06:12.423 --rc genhtml_legend=1 00:06:12.423 --rc geninfo_all_blocks=1 00:06:12.423 --rc geninfo_unexecuted_blocks=1 00:06:12.423 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.423 ' 00:06:12.423 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:12.423 15:43:20 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:12.423 15:43:20 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:12.423 15:43:20 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.423 15:43:20 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.423 15:43:20 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.423 15:43:20 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:12.423 15:43:20 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:12.423 15:43:20 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:12.424 15:43:20 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:12.424 15:43:20 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:12.424 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:12.424 15:43:20 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:12.424 15:43:20 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:12.424 15:43:20 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:12.424 INFO: launching applications... 00:06:12.424 15:43:20 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:12.424 15:43:20 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:12.424 15:43:20 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:12.424 15:43:20 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:12.424 15:43:20 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:12.424 15:43:20 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:12.424 15:43:20 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:12.424 15:43:20 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:12.424 15:43:20 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1698905 00:06:12.424 15:43:20 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:12.424 Waiting for target to run... 00:06:12.424 15:43:20 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1698905 /var/tmp/spdk_tgt.sock 00:06:12.424 15:43:20 json_config_extra_key -- common/autotest_common.sh@835 -- # '[' -z 1698905 ']' 00:06:12.424 15:43:20 json_config_extra_key -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:12.424 15:43:20 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:12.424 15:43:20 json_config_extra_key -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:12.424 15:43:20 json_config_extra_key -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:12.424 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
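The waitforlisten step above reduces to launching the target with its JSON config and polling the RPC socket; a simplified sketch (the real waitforlisten also re-checks the pid and retries rpc.py under a timeout) is:

    SPDK_BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
    "$SPDK_BIN" -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
        --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json &
    pid=$!
    until [ -S /var/tmp/spdk_tgt.sock ]; do sleep 0.1; done   # wait for the Unix-domain RPC socket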
00:06:12.424 15:43:20 json_config_extra_key -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:12.424 15:43:20 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:12.424 [2024-11-30 15:43:20.336186] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:12.424 [2024-11-30 15:43:20.336253] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1698905 ] 00:06:12.991 [2024-11-30 15:43:20.833381] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:12.991 [2024-11-30 15:43:20.871150] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.991 [2024-11-30 15:43:20.895026] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.251 15:43:21 json_config_extra_key -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:13.251 15:43:21 json_config_extra_key -- common/autotest_common.sh@868 -- # return 0 00:06:13.251 15:43:21 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:13.251 00:06:13.251 15:43:21 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:13.251 INFO: shutting down applications... 00:06:13.251 15:43:21 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:13.251 15:43:21 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:13.251 15:43:21 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:13.251 15:43:21 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1698905 ]] 00:06:13.251 15:43:21 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1698905 00:06:13.251 15:43:21 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:13.251 15:43:21 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:13.251 15:43:21 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1698905 00:06:13.251 15:43:21 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:13.904 15:43:21 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:13.904 15:43:21 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:13.904 15:43:21 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1698905 00:06:13.904 15:43:21 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:13.904 15:43:21 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:13.904 15:43:21 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:13.904 15:43:21 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:13.904 SPDK target shutdown done 00:06:13.904 15:43:21 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:13.904 Success 00:06:13.904 00:06:13.904 real 0m1.592s 00:06:13.904 user 0m1.096s 00:06:13.904 sys 0m0.561s 00:06:13.904 15:43:21 json_config_extra_key -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:13.904 15:43:21 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:13.904 ************************************ 00:06:13.904 END TEST json_config_extra_key 00:06:13.904 
************************************ 00:06:13.904 15:43:21 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:13.904 15:43:21 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:13.904 15:43:21 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:13.904 15:43:21 -- common/autotest_common.sh@10 -- # set +x 00:06:13.904 ************************************ 00:06:13.904 START TEST alias_rpc 00:06:13.904 ************************************ 00:06:13.904 15:43:21 alias_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:14.202 * Looking for test storage... 00:06:14.202 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@345 -- # : 1 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:14.202 15:43:21 alias_rpc -- scripts/common.sh@368 -- # return 0 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:14.202 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.202 --rc genhtml_branch_coverage=1 00:06:14.202 --rc genhtml_function_coverage=1 00:06:14.202 --rc genhtml_legend=1 00:06:14.202 --rc geninfo_all_blocks=1 00:06:14.202 --rc geninfo_unexecuted_blocks=1 00:06:14.202 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.202 ' 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:14.202 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.202 --rc genhtml_branch_coverage=1 00:06:14.202 --rc genhtml_function_coverage=1 00:06:14.202 --rc genhtml_legend=1 00:06:14.202 --rc geninfo_all_blocks=1 00:06:14.202 --rc geninfo_unexecuted_blocks=1 00:06:14.202 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.202 ' 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:14.202 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.202 --rc genhtml_branch_coverage=1 00:06:14.202 --rc genhtml_function_coverage=1 00:06:14.202 --rc genhtml_legend=1 00:06:14.202 --rc geninfo_all_blocks=1 00:06:14.202 --rc geninfo_unexecuted_blocks=1 00:06:14.202 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.202 ' 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:14.202 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:14.202 --rc genhtml_branch_coverage=1 00:06:14.202 --rc genhtml_function_coverage=1 00:06:14.202 --rc genhtml_legend=1 00:06:14.202 --rc geninfo_all_blocks=1 00:06:14.202 --rc geninfo_unexecuted_blocks=1 00:06:14.202 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:14.202 ' 00:06:14.202 15:43:21 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:14.202 15:43:21 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1699290 00:06:14.202 15:43:21 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:14.202 15:43:21 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1699290 00:06:14.202 15:43:21 alias_rpc -- 
common/autotest_common.sh@835 -- # '[' -z 1699290 ']' 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.202 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:14.202 15:43:21 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:14.202 [2024-11-30 15:43:21.980214] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:14.202 [2024-11-30 15:43:21.980311] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1699290 ] 00:06:14.202 [2024-11-30 15:43:22.116620] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:14.514 [2024-11-30 15:43:22.152478] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.514 [2024-11-30 15:43:22.175064] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.089 15:43:22 alias_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:15.089 15:43:22 alias_rpc -- common/autotest_common.sh@868 -- # return 0 00:06:15.089 15:43:22 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:15.089 15:43:23 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1699290 00:06:15.089 15:43:23 alias_rpc -- common/autotest_common.sh@954 -- # '[' -z 1699290 ']' 00:06:15.089 15:43:23 alias_rpc -- common/autotest_common.sh@958 -- # kill -0 1699290 00:06:15.089 15:43:23 alias_rpc -- common/autotest_common.sh@959 -- # uname 00:06:15.089 15:43:23 alias_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:15.089 15:43:23 alias_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1699290 00:06:15.348 15:43:23 alias_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:15.348 15:43:23 alias_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:15.348 15:43:23 alias_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1699290' 00:06:15.348 killing process with pid 1699290 00:06:15.348 15:43:23 alias_rpc -- common/autotest_common.sh@973 -- # kill 1699290 00:06:15.348 15:43:23 alias_rpc -- common/autotest_common.sh@978 -- # wait 1699290 00:06:15.607 00:06:15.607 real 0m1.608s 00:06:15.607 user 0m1.647s 00:06:15.607 sys 0m0.472s 00:06:15.607 15:43:23 alias_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:15.607 15:43:23 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:15.607 ************************************ 00:06:15.607 END TEST alias_rpc 00:06:15.607 ************************************ 00:06:15.607 15:43:23 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:06:15.607 15:43:23 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:15.607 15:43:23 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:15.607 15:43:23 -- 
common/autotest_common.sh@1111 -- # xtrace_disable 00:06:15.607 15:43:23 -- common/autotest_common.sh@10 -- # set +x 00:06:15.607 ************************************ 00:06:15.607 START TEST spdkcli_tcp 00:06:15.607 ************************************ 00:06:15.607 15:43:23 spdkcli_tcp -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:15.607 * Looking for test storage... 00:06:15.607 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:15.607 15:43:23 spdkcli_tcp -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:15.607 15:43:23 spdkcli_tcp -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:15.607 15:43:23 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lcov --version 00:06:15.866 15:43:23 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:15.866 15:43:23 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:06:15.866 15:43:23 spdkcli_tcp -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:15.866 15:43:23 spdkcli_tcp -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:15.866 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:15.866 --rc genhtml_branch_coverage=1 00:06:15.866 --rc genhtml_function_coverage=1 00:06:15.866 --rc genhtml_legend=1 00:06:15.866 --rc geninfo_all_blocks=1 00:06:15.866 --rc geninfo_unexecuted_blocks=1 00:06:15.866 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:15.866 ' 00:06:15.866 15:43:23 spdkcli_tcp -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:15.866 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:15.866 --rc genhtml_branch_coverage=1 00:06:15.866 --rc genhtml_function_coverage=1 00:06:15.866 --rc genhtml_legend=1 00:06:15.866 --rc geninfo_all_blocks=1 00:06:15.866 --rc geninfo_unexecuted_blocks=1 00:06:15.866 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:15.866 ' 00:06:15.866 15:43:23 spdkcli_tcp -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:15.866 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:15.866 --rc genhtml_branch_coverage=1 00:06:15.866 --rc genhtml_function_coverage=1 00:06:15.867 --rc genhtml_legend=1 00:06:15.867 --rc geninfo_all_blocks=1 00:06:15.867 --rc geninfo_unexecuted_blocks=1 00:06:15.867 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:15.867 ' 00:06:15.867 15:43:23 spdkcli_tcp -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:15.867 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:15.867 --rc genhtml_branch_coverage=1 00:06:15.867 --rc genhtml_function_coverage=1 00:06:15.867 --rc genhtml_legend=1 00:06:15.867 --rc geninfo_all_blocks=1 00:06:15.867 --rc geninfo_unexecuted_blocks=1 00:06:15.867 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:15.867 ' 00:06:15.867 15:43:23 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:15.867 15:43:23 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:15.867 15:43:23 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:15.867 15:43:23 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:15.867 15:43:23 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:15.867 15:43:23 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:15.867 15:43:23 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:15.867 15:43:23 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:15.867 15:43:23 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:15.867 15:43:23 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1699677 00:06:15.867 15:43:23 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1699677 00:06:15.867 15:43:23 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:15.867 15:43:23 spdkcli_tcp -- common/autotest_common.sh@835 -- # '[' -z 1699677 ']' 00:06:15.867 15:43:23 spdkcli_tcp -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:15.867 15:43:23 spdkcli_tcp -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:15.867 15:43:23 spdkcli_tcp -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:15.867 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:15.867 15:43:23 spdkcli_tcp -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:15.867 15:43:23 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:15.867 [2024-11-30 15:43:23.687098] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:15.867 [2024-11-30 15:43:23.687181] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1699677 ] 00:06:15.867 [2024-11-30 15:43:23.823521] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:06:16.125 [2024-11-30 15:43:23.857627] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:16.125 [2024-11-30 15:43:23.881250] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:16.125 [2024-11-30 15:43:23.881253] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.692 15:43:24 spdkcli_tcp -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:16.692 15:43:24 spdkcli_tcp -- common/autotest_common.sh@868 -- # return 0 00:06:16.692 15:43:24 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1699701 00:06:16.692 15:43:24 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:16.692 15:43:24 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:16.952 [ 00:06:16.952 "spdk_get_version", 00:06:16.952 "rpc_get_methods", 00:06:16.952 "notify_get_notifications", 00:06:16.952 "notify_get_types", 00:06:16.952 "trace_get_info", 00:06:16.952 "trace_get_tpoint_group_mask", 00:06:16.952 "trace_disable_tpoint_group", 00:06:16.952 "trace_enable_tpoint_group", 00:06:16.952 "trace_clear_tpoint_mask", 00:06:16.952 "trace_set_tpoint_mask", 00:06:16.952 "fsdev_set_opts", 00:06:16.952 "fsdev_get_opts", 00:06:16.952 "framework_get_pci_devices", 00:06:16.952 "framework_get_config", 00:06:16.952 "framework_get_subsystems", 00:06:16.952 "vfu_tgt_set_base_path", 00:06:16.952 "keyring_get_keys", 00:06:16.952 "iobuf_get_stats", 00:06:16.952 "iobuf_set_options", 00:06:16.952 "sock_get_default_impl", 00:06:16.952 "sock_set_default_impl", 00:06:16.952 "sock_impl_set_options", 00:06:16.952 "sock_impl_get_options", 00:06:16.952 "vmd_rescan", 00:06:16.952 "vmd_remove_device", 00:06:16.952 "vmd_enable", 00:06:16.952 "accel_get_stats", 00:06:16.952 "accel_set_options", 00:06:16.952 "accel_set_driver", 00:06:16.952 "accel_crypto_key_destroy", 00:06:16.952 "accel_crypto_keys_get", 00:06:16.952 "accel_crypto_key_create", 00:06:16.952 "accel_assign_opc", 00:06:16.952 "accel_get_module_info", 00:06:16.952 "accel_get_opc_assignments", 00:06:16.952 "bdev_get_histogram", 00:06:16.952 "bdev_enable_histogram", 00:06:16.952 "bdev_set_qos_limit", 00:06:16.952 "bdev_set_qd_sampling_period", 00:06:16.952 "bdev_get_bdevs", 00:06:16.952 "bdev_reset_iostat", 00:06:16.952 "bdev_get_iostat", 00:06:16.952 "bdev_examine", 00:06:16.952 "bdev_wait_for_examine", 00:06:16.952 "bdev_set_options", 00:06:16.952 "scsi_get_devices", 00:06:16.952 "thread_set_cpumask", 00:06:16.952 "scheduler_set_options", 00:06:16.952 "framework_get_governor", 00:06:16.952 "framework_get_scheduler", 00:06:16.952 "framework_set_scheduler", 00:06:16.952 "framework_get_reactors", 00:06:16.952 "thread_get_io_channels", 00:06:16.952 "thread_get_pollers", 00:06:16.952 "thread_get_stats", 00:06:16.952 "framework_monitor_context_switch", 00:06:16.952 "spdk_kill_instance", 00:06:16.952 "log_enable_timestamps", 00:06:16.952 "log_get_flags", 00:06:16.952 "log_clear_flag", 00:06:16.952 "log_set_flag", 00:06:16.952 "log_get_level", 00:06:16.952 "log_set_level", 00:06:16.952 "log_get_print_level", 00:06:16.952 "log_set_print_level", 00:06:16.952 "framework_enable_cpumask_locks", 00:06:16.952 "framework_disable_cpumask_locks", 00:06:16.952 "framework_wait_init", 00:06:16.952 "framework_start_init", 00:06:16.952 "virtio_blk_create_transport", 00:06:16.952 "virtio_blk_get_transports", 00:06:16.952 "vhost_controller_set_coalescing", 00:06:16.952 "vhost_get_controllers", 00:06:16.952 
"vhost_delete_controller", 00:06:16.952 "vhost_create_blk_controller", 00:06:16.952 "vhost_scsi_controller_remove_target", 00:06:16.952 "vhost_scsi_controller_add_target", 00:06:16.952 "vhost_start_scsi_controller", 00:06:16.952 "vhost_create_scsi_controller", 00:06:16.952 "ublk_recover_disk", 00:06:16.952 "ublk_get_disks", 00:06:16.952 "ublk_stop_disk", 00:06:16.952 "ublk_start_disk", 00:06:16.952 "ublk_destroy_target", 00:06:16.952 "ublk_create_target", 00:06:16.952 "nbd_get_disks", 00:06:16.952 "nbd_stop_disk", 00:06:16.952 "nbd_start_disk", 00:06:16.952 "env_dpdk_get_mem_stats", 00:06:16.952 "nvmf_stop_mdns_prr", 00:06:16.952 "nvmf_publish_mdns_prr", 00:06:16.952 "nvmf_subsystem_get_listeners", 00:06:16.952 "nvmf_subsystem_get_qpairs", 00:06:16.952 "nvmf_subsystem_get_controllers", 00:06:16.952 "nvmf_get_stats", 00:06:16.952 "nvmf_get_transports", 00:06:16.952 "nvmf_create_transport", 00:06:16.952 "nvmf_get_targets", 00:06:16.952 "nvmf_delete_target", 00:06:16.952 "nvmf_create_target", 00:06:16.952 "nvmf_subsystem_allow_any_host", 00:06:16.952 "nvmf_subsystem_set_keys", 00:06:16.952 "nvmf_subsystem_remove_host", 00:06:16.952 "nvmf_subsystem_add_host", 00:06:16.952 "nvmf_ns_remove_host", 00:06:16.952 "nvmf_ns_add_host", 00:06:16.952 "nvmf_subsystem_remove_ns", 00:06:16.952 "nvmf_subsystem_set_ns_ana_group", 00:06:16.952 "nvmf_subsystem_add_ns", 00:06:16.952 "nvmf_subsystem_listener_set_ana_state", 00:06:16.952 "nvmf_discovery_get_referrals", 00:06:16.952 "nvmf_discovery_remove_referral", 00:06:16.952 "nvmf_discovery_add_referral", 00:06:16.952 "nvmf_subsystem_remove_listener", 00:06:16.952 "nvmf_subsystem_add_listener", 00:06:16.952 "nvmf_delete_subsystem", 00:06:16.952 "nvmf_create_subsystem", 00:06:16.952 "nvmf_get_subsystems", 00:06:16.952 "nvmf_set_crdt", 00:06:16.952 "nvmf_set_config", 00:06:16.952 "nvmf_set_max_subsystems", 00:06:16.952 "iscsi_get_histogram", 00:06:16.952 "iscsi_enable_histogram", 00:06:16.952 "iscsi_set_options", 00:06:16.952 "iscsi_get_auth_groups", 00:06:16.952 "iscsi_auth_group_remove_secret", 00:06:16.952 "iscsi_auth_group_add_secret", 00:06:16.952 "iscsi_delete_auth_group", 00:06:16.952 "iscsi_create_auth_group", 00:06:16.952 "iscsi_set_discovery_auth", 00:06:16.952 "iscsi_get_options", 00:06:16.952 "iscsi_target_node_request_logout", 00:06:16.952 "iscsi_target_node_set_redirect", 00:06:16.952 "iscsi_target_node_set_auth", 00:06:16.952 "iscsi_target_node_add_lun", 00:06:16.952 "iscsi_get_stats", 00:06:16.952 "iscsi_get_connections", 00:06:16.952 "iscsi_portal_group_set_auth", 00:06:16.952 "iscsi_start_portal_group", 00:06:16.952 "iscsi_delete_portal_group", 00:06:16.952 "iscsi_create_portal_group", 00:06:16.952 "iscsi_get_portal_groups", 00:06:16.952 "iscsi_delete_target_node", 00:06:16.952 "iscsi_target_node_remove_pg_ig_maps", 00:06:16.952 "iscsi_target_node_add_pg_ig_maps", 00:06:16.952 "iscsi_create_target_node", 00:06:16.952 "iscsi_get_target_nodes", 00:06:16.952 "iscsi_delete_initiator_group", 00:06:16.952 "iscsi_initiator_group_remove_initiators", 00:06:16.952 "iscsi_initiator_group_add_initiators", 00:06:16.952 "iscsi_create_initiator_group", 00:06:16.952 "iscsi_get_initiator_groups", 00:06:16.952 "fsdev_aio_delete", 00:06:16.952 "fsdev_aio_create", 00:06:16.952 "keyring_linux_set_options", 00:06:16.953 "keyring_file_remove_key", 00:06:16.953 "keyring_file_add_key", 00:06:16.953 "vfu_virtio_create_fs_endpoint", 00:06:16.953 "vfu_virtio_create_scsi_endpoint", 00:06:16.953 "vfu_virtio_scsi_remove_target", 00:06:16.953 "vfu_virtio_scsi_add_target", 
00:06:16.953 "vfu_virtio_create_blk_endpoint", 00:06:16.953 "vfu_virtio_delete_endpoint", 00:06:16.953 "iaa_scan_accel_module", 00:06:16.953 "dsa_scan_accel_module", 00:06:16.953 "ioat_scan_accel_module", 00:06:16.953 "accel_error_inject_error", 00:06:16.953 "bdev_iscsi_delete", 00:06:16.953 "bdev_iscsi_create", 00:06:16.953 "bdev_iscsi_set_options", 00:06:16.953 "bdev_virtio_attach_controller", 00:06:16.953 "bdev_virtio_scsi_get_devices", 00:06:16.953 "bdev_virtio_detach_controller", 00:06:16.953 "bdev_virtio_blk_set_hotplug", 00:06:16.953 "bdev_ftl_set_property", 00:06:16.953 "bdev_ftl_get_properties", 00:06:16.953 "bdev_ftl_get_stats", 00:06:16.953 "bdev_ftl_unmap", 00:06:16.953 "bdev_ftl_unload", 00:06:16.953 "bdev_ftl_delete", 00:06:16.953 "bdev_ftl_load", 00:06:16.953 "bdev_ftl_create", 00:06:16.953 "bdev_aio_delete", 00:06:16.953 "bdev_aio_rescan", 00:06:16.953 "bdev_aio_create", 00:06:16.953 "blobfs_create", 00:06:16.953 "blobfs_detect", 00:06:16.953 "blobfs_set_cache_size", 00:06:16.953 "bdev_zone_block_delete", 00:06:16.953 "bdev_zone_block_create", 00:06:16.953 "bdev_delay_delete", 00:06:16.953 "bdev_delay_create", 00:06:16.953 "bdev_delay_update_latency", 00:06:16.953 "bdev_split_delete", 00:06:16.953 "bdev_split_create", 00:06:16.953 "bdev_error_inject_error", 00:06:16.953 "bdev_error_delete", 00:06:16.953 "bdev_error_create", 00:06:16.953 "bdev_raid_set_options", 00:06:16.953 "bdev_raid_remove_base_bdev", 00:06:16.953 "bdev_raid_add_base_bdev", 00:06:16.953 "bdev_raid_delete", 00:06:16.953 "bdev_raid_create", 00:06:16.953 "bdev_raid_get_bdevs", 00:06:16.953 "bdev_lvol_set_parent_bdev", 00:06:16.953 "bdev_lvol_set_parent", 00:06:16.953 "bdev_lvol_check_shallow_copy", 00:06:16.953 "bdev_lvol_start_shallow_copy", 00:06:16.953 "bdev_lvol_grow_lvstore", 00:06:16.953 "bdev_lvol_get_lvols", 00:06:16.953 "bdev_lvol_get_lvstores", 00:06:16.953 "bdev_lvol_delete", 00:06:16.953 "bdev_lvol_set_read_only", 00:06:16.953 "bdev_lvol_resize", 00:06:16.953 "bdev_lvol_decouple_parent", 00:06:16.953 "bdev_lvol_inflate", 00:06:16.953 "bdev_lvol_rename", 00:06:16.953 "bdev_lvol_clone_bdev", 00:06:16.953 "bdev_lvol_clone", 00:06:16.953 "bdev_lvol_snapshot", 00:06:16.953 "bdev_lvol_create", 00:06:16.953 "bdev_lvol_delete_lvstore", 00:06:16.953 "bdev_lvol_rename_lvstore", 00:06:16.953 "bdev_lvol_create_lvstore", 00:06:16.953 "bdev_passthru_delete", 00:06:16.953 "bdev_passthru_create", 00:06:16.953 "bdev_nvme_cuse_unregister", 00:06:16.953 "bdev_nvme_cuse_register", 00:06:16.953 "bdev_opal_new_user", 00:06:16.953 "bdev_opal_set_lock_state", 00:06:16.953 "bdev_opal_delete", 00:06:16.953 "bdev_opal_get_info", 00:06:16.953 "bdev_opal_create", 00:06:16.953 "bdev_nvme_opal_revert", 00:06:16.953 "bdev_nvme_opal_init", 00:06:16.953 "bdev_nvme_send_cmd", 00:06:16.953 "bdev_nvme_set_keys", 00:06:16.953 "bdev_nvme_get_path_iostat", 00:06:16.953 "bdev_nvme_get_mdns_discovery_info", 00:06:16.953 "bdev_nvme_stop_mdns_discovery", 00:06:16.953 "bdev_nvme_start_mdns_discovery", 00:06:16.953 "bdev_nvme_set_multipath_policy", 00:06:16.953 "bdev_nvme_set_preferred_path", 00:06:16.953 "bdev_nvme_get_io_paths", 00:06:16.953 "bdev_nvme_remove_error_injection", 00:06:16.953 "bdev_nvme_add_error_injection", 00:06:16.953 "bdev_nvme_get_discovery_info", 00:06:16.953 "bdev_nvme_stop_discovery", 00:06:16.953 "bdev_nvme_start_discovery", 00:06:16.953 "bdev_nvme_get_controller_health_info", 00:06:16.953 "bdev_nvme_disable_controller", 00:06:16.953 "bdev_nvme_enable_controller", 00:06:16.953 "bdev_nvme_reset_controller", 
00:06:16.953 "bdev_nvme_get_transport_statistics", 00:06:16.953 "bdev_nvme_apply_firmware", 00:06:16.953 "bdev_nvme_detach_controller", 00:06:16.953 "bdev_nvme_get_controllers", 00:06:16.953 "bdev_nvme_attach_controller", 00:06:16.953 "bdev_nvme_set_hotplug", 00:06:16.953 "bdev_nvme_set_options", 00:06:16.953 "bdev_null_resize", 00:06:16.953 "bdev_null_delete", 00:06:16.953 "bdev_null_create", 00:06:16.953 "bdev_malloc_delete", 00:06:16.953 "bdev_malloc_create" 00:06:16.953 ] 00:06:16.953 15:43:24 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:16.953 15:43:24 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:16.953 15:43:24 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1699677 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' -z 1699677 ']' 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@958 -- # kill -0 1699677 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@959 -- # uname 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1699677 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1699677' 00:06:16.953 killing process with pid 1699677 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@973 -- # kill 1699677 00:06:16.953 15:43:24 spdkcli_tcp -- common/autotest_common.sh@978 -- # wait 1699677 00:06:17.212 00:06:17.212 real 0m1.646s 00:06:17.212 user 0m2.832s 00:06:17.212 sys 0m0.544s 00:06:17.212 15:43:25 spdkcli_tcp -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:17.212 15:43:25 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:17.212 ************************************ 00:06:17.212 END TEST spdkcli_tcp 00:06:17.212 ************************************ 00:06:17.212 15:43:25 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:17.212 15:43:25 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:17.212 15:43:25 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:17.212 15:43:25 -- common/autotest_common.sh@10 -- # set +x 00:06:17.470 ************************************ 00:06:17.470 START TEST dpdk_mem_utility 00:06:17.470 ************************************ 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:17.470 * Looking for test storage... 
00:06:17.470 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lcov --version 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:17.470 15:43:25 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:17.470 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:17.470 --rc genhtml_branch_coverage=1 00:06:17.470 --rc genhtml_function_coverage=1 00:06:17.470 --rc genhtml_legend=1 00:06:17.470 --rc geninfo_all_blocks=1 00:06:17.470 --rc geninfo_unexecuted_blocks=1 00:06:17.470 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:17.470 ' 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # 
LCOV_OPTS=' 00:06:17.470 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:17.470 --rc genhtml_branch_coverage=1 00:06:17.470 --rc genhtml_function_coverage=1 00:06:17.470 --rc genhtml_legend=1 00:06:17.470 --rc geninfo_all_blocks=1 00:06:17.470 --rc geninfo_unexecuted_blocks=1 00:06:17.470 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:17.470 ' 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:17.470 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:17.470 --rc genhtml_branch_coverage=1 00:06:17.470 --rc genhtml_function_coverage=1 00:06:17.470 --rc genhtml_legend=1 00:06:17.470 --rc geninfo_all_blocks=1 00:06:17.470 --rc geninfo_unexecuted_blocks=1 00:06:17.470 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:17.470 ' 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:17.470 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:17.470 --rc genhtml_branch_coverage=1 00:06:17.470 --rc genhtml_function_coverage=1 00:06:17.470 --rc genhtml_legend=1 00:06:17.470 --rc geninfo_all_blocks=1 00:06:17.470 --rc geninfo_unexecuted_blocks=1 00:06:17.470 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:17.470 ' 00:06:17.470 15:43:25 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:17.470 15:43:25 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1700024 00:06:17.470 15:43:25 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1700024 00:06:17.470 15:43:25 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@835 -- # '[' -z 1700024 ']' 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:17.470 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:17.470 15:43:25 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:17.470 [2024-11-30 15:43:25.404755] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:17.470 [2024-11-30 15:43:25.404845] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1700024 ] 00:06:17.729 [2024-11-30 15:43:25.541199] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:06:17.729 [2024-11-30 15:43:25.572801] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.729 [2024-11-30 15:43:25.595165] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.297 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:18.297 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@868 -- # return 0 00:06:18.297 15:43:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:18.297 15:43:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:18.297 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:18.297 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:18.557 { 00:06:18.557 "filename": "/tmp/spdk_mem_dump.txt" 00:06:18.557 } 00:06:18.557 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:18.557 15:43:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:18.557 DPDK memory size 818.000000 MiB in 1 heap(s) 00:06:18.557 1 heaps totaling size 818.000000 MiB 00:06:18.557 size: 818.000000 MiB heap id: 0 00:06:18.557 end heaps---------- 00:06:18.557 9 mempools totaling size 603.782043 MiB 00:06:18.557 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:18.557 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:18.557 size: 100.555481 MiB name: bdev_io_1700024 00:06:18.557 size: 50.003479 MiB name: msgpool_1700024 00:06:18.557 size: 36.509338 MiB name: fsdev_io_1700024 00:06:18.557 size: 21.763794 MiB name: PDU_Pool 00:06:18.557 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:18.557 size: 4.133484 MiB name: evtpool_1700024 00:06:18.557 size: 0.026123 MiB name: Session_Pool 00:06:18.557 end mempools------- 00:06:18.557 6 memzones totaling size 4.142822 MiB 00:06:18.557 size: 1.000366 MiB name: RG_ring_0_1700024 00:06:18.557 size: 1.000366 MiB name: RG_ring_1_1700024 00:06:18.557 size: 1.000366 MiB name: RG_ring_4_1700024 00:06:18.557 size: 1.000366 MiB name: RG_ring_5_1700024 00:06:18.557 size: 0.125366 MiB name: RG_ring_2_1700024 00:06:18.557 size: 0.015991 MiB name: RG_ring_3_1700024 00:06:18.557 end memzones------- 00:06:18.557 15:43:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:18.557 heap id: 0 total size: 818.000000 MiB number of busy elements: 43 number of free elements: 15 00:06:18.557 list of free elements. 
size: 10.993225 MiB 00:06:18.557 element at address: 0x200019200000 with size: 0.999878 MiB 00:06:18.557 element at address: 0x200019400000 with size: 0.999878 MiB 00:06:18.557 element at address: 0x200000400000 with size: 0.998535 MiB 00:06:18.557 element at address: 0x200032000000 with size: 0.994446 MiB 00:06:18.557 element at address: 0x200008000000 with size: 0.959839 MiB 00:06:18.557 element at address: 0x200012c00000 with size: 0.944275 MiB 00:06:18.557 element at address: 0x200019600000 with size: 0.936584 MiB 00:06:18.557 element at address: 0x200000200000 with size: 0.858093 MiB 00:06:18.557 element at address: 0x20001ae00000 with size: 0.582886 MiB 00:06:18.557 element at address: 0x200000c00000 with size: 0.495422 MiB 00:06:18.557 element at address: 0x200003e00000 with size: 0.490723 MiB 00:06:18.557 element at address: 0x200019800000 with size: 0.485657 MiB 00:06:18.557 element at address: 0x200010600000 with size: 0.481934 MiB 00:06:18.557 element at address: 0x200028200000 with size: 0.410034 MiB 00:06:18.557 element at address: 0x200000800000 with size: 0.355042 MiB 00:06:18.557 list of standard malloc elements. size: 199.077881 MiB 00:06:18.557 element at address: 0x2000081fff80 with size: 132.000122 MiB 00:06:18.557 element at address: 0x200003ffff80 with size: 64.000122 MiB 00:06:18.557 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:18.557 element at address: 0x2000194fff80 with size: 1.000122 MiB 00:06:18.557 element at address: 0x2000196fff80 with size: 1.000122 MiB 00:06:18.557 element at address: 0x2000196eff00 with size: 0.062622 MiB 00:06:18.557 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:18.557 element at address: 0x2000196efdc0 with size: 0.000305 MiB 00:06:18.557 element at address: 0x2000002fbcc0 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000003fdec0 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000004ffa00 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000004ffac0 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000004ffb80 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000004ffd80 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000004ffe40 with size: 0.000183 MiB 00:06:18.557 element at address: 0x20000085ae40 with size: 0.000183 MiB 00:06:18.557 element at address: 0x20000085b040 with size: 0.000183 MiB 00:06:18.557 element at address: 0x20000085b100 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000008db3c0 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000008db5c0 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000008df880 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000008ffb40 with size: 0.000183 MiB 00:06:18.557 element at address: 0x200000c7ed40 with size: 0.000183 MiB 00:06:18.557 element at address: 0x200000cff000 with size: 0.000183 MiB 00:06:18.557 element at address: 0x200000cff0c0 with size: 0.000183 MiB 00:06:18.557 element at address: 0x200003e7da00 with size: 0.000183 MiB 00:06:18.557 element at address: 0x200003e7dac0 with size: 0.000183 MiB 00:06:18.557 element at address: 0x200003efdd80 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000080fdd80 with size: 0.000183 MiB 00:06:18.557 element at address: 0x20001067b600 with size: 0.000183 MiB 00:06:18.557 element at address: 0x20001067b6c0 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000106fb980 with size: 0.000183 MiB 00:06:18.557 element at address: 0x200012cf1bc0 with size: 0.000183 MiB 
00:06:18.557 element at address: 0x2000196efc40 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000196efd00 with size: 0.000183 MiB 00:06:18.557 element at address: 0x2000198bc740 with size: 0.000183 MiB 00:06:18.557 element at address: 0x20001ae95380 with size: 0.000183 MiB 00:06:18.557 element at address: 0x20001ae95440 with size: 0.000183 MiB 00:06:18.557 element at address: 0x200028268f80 with size: 0.000183 MiB 00:06:18.557 element at address: 0x200028269040 with size: 0.000183 MiB 00:06:18.557 element at address: 0x20002826fc40 with size: 0.000183 MiB 00:06:18.557 element at address: 0x20002826fe40 with size: 0.000183 MiB 00:06:18.557 element at address: 0x20002826ff00 with size: 0.000183 MiB 00:06:18.557 list of memzone associated elements. size: 607.928894 MiB 00:06:18.557 element at address: 0x20001ae95500 with size: 211.416748 MiB 00:06:18.557 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:18.557 element at address: 0x20002826ffc0 with size: 157.562561 MiB 00:06:18.557 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:18.557 element at address: 0x200012df1e80 with size: 100.055054 MiB 00:06:18.557 associated memzone info: size: 100.054932 MiB name: MP_bdev_io_1700024_0 00:06:18.557 element at address: 0x200000dff380 with size: 48.003052 MiB 00:06:18.557 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1700024_0 00:06:18.557 element at address: 0x2000107fdb80 with size: 36.008911 MiB 00:06:18.557 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_1700024_0 00:06:18.557 element at address: 0x2000199be940 with size: 20.255554 MiB 00:06:18.557 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:18.557 element at address: 0x2000321feb40 with size: 18.005066 MiB 00:06:18.557 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:18.557 element at address: 0x2000004fff00 with size: 3.000244 MiB 00:06:18.557 associated memzone info: size: 3.000122 MiB name: MP_evtpool_1700024_0 00:06:18.557 element at address: 0x2000009ffe00 with size: 2.000488 MiB 00:06:18.557 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1700024 00:06:18.557 element at address: 0x2000002fbd80 with size: 1.008118 MiB 00:06:18.557 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1700024 00:06:18.558 element at address: 0x2000106fba40 with size: 1.008118 MiB 00:06:18.558 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:18.558 element at address: 0x2000198bc800 with size: 1.008118 MiB 00:06:18.558 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:18.558 element at address: 0x2000080fde40 with size: 1.008118 MiB 00:06:18.558 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:18.558 element at address: 0x200003efde40 with size: 1.008118 MiB 00:06:18.558 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:18.558 element at address: 0x200000cff180 with size: 1.000488 MiB 00:06:18.558 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1700024 00:06:18.558 element at address: 0x2000008ffc00 with size: 1.000488 MiB 00:06:18.558 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1700024 00:06:18.558 element at address: 0x200012cf1c80 with size: 1.000488 MiB 00:06:18.558 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1700024 00:06:18.558 element at address: 0x2000320fe940 with size: 1.000488 MiB 00:06:18.558 associated memzone 
info: size: 1.000366 MiB name: RG_ring_5_1700024 00:06:18.558 element at address: 0x20000085b1c0 with size: 0.500488 MiB 00:06:18.558 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_1700024 00:06:18.558 element at address: 0x200000c7ee00 with size: 0.500488 MiB 00:06:18.558 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1700024 00:06:18.558 element at address: 0x20001067b780 with size: 0.500488 MiB 00:06:18.558 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:18.558 element at address: 0x200003e7db80 with size: 0.500488 MiB 00:06:18.558 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:18.558 element at address: 0x20001987c540 with size: 0.250488 MiB 00:06:18.558 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:18.558 element at address: 0x2000002dbac0 with size: 0.125488 MiB 00:06:18.558 associated memzone info: size: 0.125366 MiB name: RG_MP_evtpool_1700024 00:06:18.558 element at address: 0x2000008df940 with size: 0.125488 MiB 00:06:18.558 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1700024 00:06:18.558 element at address: 0x2000080f5b80 with size: 0.031738 MiB 00:06:18.558 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:18.558 element at address: 0x200028269100 with size: 0.023743 MiB 00:06:18.558 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:18.558 element at address: 0x2000008db680 with size: 0.016113 MiB 00:06:18.558 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1700024 00:06:18.558 element at address: 0x20002826f240 with size: 0.002441 MiB 00:06:18.558 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:18.558 element at address: 0x2000004ffc40 with size: 0.000305 MiB 00:06:18.558 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1700024 00:06:18.558 element at address: 0x2000008db480 with size: 0.000305 MiB 00:06:18.558 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_1700024 00:06:18.558 element at address: 0x20000085af00 with size: 0.000305 MiB 00:06:18.558 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1700024 00:06:18.558 element at address: 0x20002826fd00 with size: 0.000305 MiB 00:06:18.558 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:18.558 15:43:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:18.558 15:43:26 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1700024 00:06:18.558 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' -z 1700024 ']' 00:06:18.558 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@958 -- # kill -0 1700024 00:06:18.558 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@959 -- # uname 00:06:18.558 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:18.558 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1700024 00:06:18.558 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:18.558 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:18.558 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1700024' 00:06:18.558 killing process with pid 1700024 00:06:18.558 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@973 -- # kill 1700024 
00:06:18.558 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@978 -- # wait 1700024 00:06:18.817 00:06:18.817 real 0m1.527s 00:06:18.817 user 0m1.483s 00:06:18.817 sys 0m0.469s 00:06:18.817 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:18.817 15:43:26 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:18.817 ************************************ 00:06:18.817 END TEST dpdk_mem_utility 00:06:18.817 ************************************ 00:06:18.817 15:43:26 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:18.817 15:43:26 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:18.817 15:43:26 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:18.817 15:43:26 -- common/autotest_common.sh@10 -- # set +x 00:06:19.076 ************************************ 00:06:19.076 START TEST event 00:06:19.076 ************************************ 00:06:19.076 15:43:26 event -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:19.076 * Looking for test storage... 00:06:19.076 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:19.076 15:43:26 event -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:19.076 15:43:26 event -- common/autotest_common.sh@1693 -- # lcov --version 00:06:19.076 15:43:26 event -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:19.076 15:43:26 event -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:19.076 15:43:26 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:19.076 15:43:26 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:19.076 15:43:26 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:19.076 15:43:26 event -- scripts/common.sh@336 -- # IFS=.-: 00:06:19.076 15:43:26 event -- scripts/common.sh@336 -- # read -ra ver1 00:06:19.076 15:43:26 event -- scripts/common.sh@337 -- # IFS=.-: 00:06:19.076 15:43:26 event -- scripts/common.sh@337 -- # read -ra ver2 00:06:19.076 15:43:26 event -- scripts/common.sh@338 -- # local 'op=<' 00:06:19.076 15:43:26 event -- scripts/common.sh@340 -- # ver1_l=2 00:06:19.076 15:43:26 event -- scripts/common.sh@341 -- # ver2_l=1 00:06:19.076 15:43:26 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:19.076 15:43:26 event -- scripts/common.sh@344 -- # case "$op" in 00:06:19.076 15:43:26 event -- scripts/common.sh@345 -- # : 1 00:06:19.076 15:43:26 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:19.076 15:43:26 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:19.076 15:43:26 event -- scripts/common.sh@365 -- # decimal 1 00:06:19.076 15:43:26 event -- scripts/common.sh@353 -- # local d=1 00:06:19.076 15:43:26 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:19.076 15:43:26 event -- scripts/common.sh@355 -- # echo 1 00:06:19.076 15:43:26 event -- scripts/common.sh@365 -- # ver1[v]=1 00:06:19.076 15:43:26 event -- scripts/common.sh@366 -- # decimal 2 00:06:19.076 15:43:26 event -- scripts/common.sh@353 -- # local d=2 00:06:19.076 15:43:26 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:19.076 15:43:26 event -- scripts/common.sh@355 -- # echo 2 00:06:19.076 15:43:26 event -- scripts/common.sh@366 -- # ver2[v]=2 00:06:19.076 15:43:26 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:19.076 15:43:26 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:19.076 15:43:26 event -- scripts/common.sh@368 -- # return 0 00:06:19.076 15:43:26 event -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:19.076 15:43:26 event -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:19.076 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:19.076 --rc genhtml_branch_coverage=1 00:06:19.076 --rc genhtml_function_coverage=1 00:06:19.076 --rc genhtml_legend=1 00:06:19.076 --rc geninfo_all_blocks=1 00:06:19.076 --rc geninfo_unexecuted_blocks=1 00:06:19.076 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:19.076 ' 00:06:19.076 15:43:26 event -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:19.076 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:19.076 --rc genhtml_branch_coverage=1 00:06:19.076 --rc genhtml_function_coverage=1 00:06:19.076 --rc genhtml_legend=1 00:06:19.076 --rc geninfo_all_blocks=1 00:06:19.076 --rc geninfo_unexecuted_blocks=1 00:06:19.076 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:19.076 ' 00:06:19.076 15:43:26 event -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:19.076 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:19.076 --rc genhtml_branch_coverage=1 00:06:19.076 --rc genhtml_function_coverage=1 00:06:19.076 --rc genhtml_legend=1 00:06:19.076 --rc geninfo_all_blocks=1 00:06:19.076 --rc geninfo_unexecuted_blocks=1 00:06:19.076 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:19.076 ' 00:06:19.076 15:43:26 event -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:19.076 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:19.076 --rc genhtml_branch_coverage=1 00:06:19.076 --rc genhtml_function_coverage=1 00:06:19.076 --rc genhtml_legend=1 00:06:19.076 --rc geninfo_all_blocks=1 00:06:19.076 --rc geninfo_unexecuted_blocks=1 00:06:19.076 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:19.076 ' 00:06:19.076 15:43:26 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:19.076 15:43:26 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:19.077 15:43:26 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:19.077 15:43:26 event -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:06:19.077 15:43:26 event -- common/autotest_common.sh@1111 -- # xtrace_disable 
00:06:19.077 15:43:26 event -- common/autotest_common.sh@10 -- # set +x 00:06:19.077 ************************************ 00:06:19.077 START TEST event_perf 00:06:19.077 ************************************ 00:06:19.077 15:43:26 event.event_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:19.077 Running I/O for 1 seconds...[2024-11-30 15:43:27.017531] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:19.077 [2024-11-30 15:43:27.017623] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1700359 ] 00:06:19.338 [2024-11-30 15:43:27.155984] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:19.338 [2024-11-30 15:43:27.190493] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:19.338 [2024-11-30 15:43:27.218806] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:19.338 [2024-11-30 15:43:27.218899] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:06:19.338 [2024-11-30 15:43:27.218917] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:06:19.338 [2024-11-30 15:43:27.218918] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.714 Running I/O for 1 seconds... 00:06:20.714 lcore 0: 181577 00:06:20.714 lcore 1: 181577 00:06:20.714 lcore 2: 181578 00:06:20.714 lcore 3: 181577 00:06:20.714 done. 00:06:20.714 00:06:20.714 real 0m1.250s 00:06:20.714 user 0m4.051s 00:06:20.714 sys 0m0.088s 00:06:20.714 15:43:28 event.event_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:20.714 15:43:28 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:20.714 ************************************ 00:06:20.714 END TEST event_perf 00:06:20.714 ************************************ 00:06:20.714 15:43:28 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:20.714 15:43:28 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:06:20.714 15:43:28 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:20.714 15:43:28 event -- common/autotest_common.sh@10 -- # set +x 00:06:20.714 ************************************ 00:06:20.714 START TEST event_reactor 00:06:20.714 ************************************ 00:06:20.714 15:43:28 event.event_reactor -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:20.714 [2024-11-30 15:43:28.351803] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:20.714 [2024-11-30 15:43:28.351886] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1700650 ] 00:06:20.714 [2024-11-30 15:43:28.491871] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
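An aside on reading the event_perf output above: with -m 0xF the app runs one reactor per core, and each "lcore N: COUNT" line is the number of events that reactor processed in the 1-second window set by -t 1, so the four counters sum to roughly 726k events per second across the mask. A quick, hypothetical way to aggregate a saved copy of that output (here assumed to be perf.log) into a single rate:

    # sum the 'lcore N: COUNT' lines and report an aggregate events/sec
    awk '/lcore [0-9]+:/ {sum += $NF} END {printf "total: %d events/s\n", sum}' perf.log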
00:06:20.714 [2024-11-30 15:43:28.526662] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.714 [2024-11-30 15:43:28.550440] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.649 test_start 00:06:21.649 oneshot 00:06:21.649 tick 100 00:06:21.649 tick 100 00:06:21.649 tick 250 00:06:21.649 tick 100 00:06:21.649 tick 100 00:06:21.649 tick 100 00:06:21.649 tick 250 00:06:21.649 tick 500 00:06:21.649 tick 100 00:06:21.649 tick 100 00:06:21.649 tick 250 00:06:21.649 tick 100 00:06:21.649 tick 100 00:06:21.649 test_end 00:06:21.649 00:06:21.649 real 0m1.245s 00:06:21.649 user 0m1.057s 00:06:21.649 sys 0m0.084s 00:06:21.649 15:43:29 event.event_reactor -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:21.649 15:43:29 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:21.649 ************************************ 00:06:21.649 END TEST event_reactor 00:06:21.649 ************************************ 00:06:21.908 15:43:29 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:21.908 15:43:29 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:06:21.908 15:43:29 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:21.908 15:43:29 event -- common/autotest_common.sh@10 -- # set +x 00:06:21.908 ************************************ 00:06:21.908 START TEST event_reactor_perf 00:06:21.908 ************************************ 00:06:21.908 15:43:29 event.event_reactor_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:21.908 [2024-11-30 15:43:29.681855] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:21.908 [2024-11-30 15:43:29.681964] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1700845 ] 00:06:21.908 [2024-11-30 15:43:29.822457] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:06:21.908 [2024-11-30 15:43:29.856776] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.166 [2024-11-30 15:43:29.880442] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.101 test_start 00:06:23.101 test_end 00:06:23.101 Performance: 959620 events per second 00:06:23.101 00:06:23.101 real 0m1.250s 00:06:23.101 user 0m1.065s 00:06:23.101 sys 0m0.081s 00:06:23.101 15:43:30 event.event_reactor_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:23.101 15:43:30 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:23.101 ************************************ 00:06:23.101 END TEST event_reactor_perf 00:06:23.101 ************************************ 00:06:23.101 15:43:30 event -- event/event.sh@49 -- # uname -s 00:06:23.101 15:43:30 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:23.101 15:43:30 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:23.101 15:43:30 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:23.101 15:43:30 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:23.101 15:43:30 event -- common/autotest_common.sh@10 -- # set +x 00:06:23.101 ************************************ 00:06:23.101 START TEST event_scheduler 00:06:23.101 ************************************ 00:06:23.101 15:43:31 event.event_scheduler -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:23.361 * Looking for test storage... 00:06:23.361 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:06:23.361 15:43:31 event.event_scheduler -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:23.361 15:43:31 event.event_scheduler -- common/autotest_common.sh@1693 -- # lcov --version 00:06:23.361 15:43:31 event.event_scheduler -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:23.361 15:43:31 event.event_scheduler -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:23.361 15:43:31 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:06:23.361 15:43:31 event.event_scheduler -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:23.361 15:43:31 event.event_scheduler -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:23.361 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.361 --rc genhtml_branch_coverage=1 00:06:23.361 --rc genhtml_function_coverage=1 00:06:23.361 --rc genhtml_legend=1 00:06:23.361 --rc geninfo_all_blocks=1 00:06:23.361 --rc geninfo_unexecuted_blocks=1 00:06:23.361 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.361 ' 00:06:23.361 15:43:31 event.event_scheduler -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:23.361 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.361 --rc genhtml_branch_coverage=1 00:06:23.361 --rc genhtml_function_coverage=1 00:06:23.361 --rc genhtml_legend=1 00:06:23.361 --rc geninfo_all_blocks=1 00:06:23.361 --rc geninfo_unexecuted_blocks=1 00:06:23.361 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.361 ' 00:06:23.361 15:43:31 event.event_scheduler -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:23.361 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.361 --rc genhtml_branch_coverage=1 00:06:23.361 --rc genhtml_function_coverage=1 00:06:23.361 --rc genhtml_legend=1 00:06:23.361 --rc geninfo_all_blocks=1 00:06:23.361 --rc geninfo_unexecuted_blocks=1 00:06:23.361 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.361 ' 00:06:23.361 15:43:31 event.event_scheduler -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:23.361 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.362 --rc genhtml_branch_coverage=1 00:06:23.362 --rc genhtml_function_coverage=1 00:06:23.362 --rc genhtml_legend=1 00:06:23.362 --rc geninfo_all_blocks=1 00:06:23.362 --rc geninfo_unexecuted_blocks=1 00:06:23.362 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.362 ' 00:06:23.362 15:43:31 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:23.362 15:43:31 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1701175 00:06:23.362 15:43:31 event.event_scheduler -- 
scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:23.362 15:43:31 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:23.362 15:43:31 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1701175 00:06:23.362 15:43:31 event.event_scheduler -- common/autotest_common.sh@835 -- # '[' -z 1701175 ']' 00:06:23.362 15:43:31 event.event_scheduler -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:23.362 15:43:31 event.event_scheduler -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:23.362 15:43:31 event.event_scheduler -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:23.362 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:23.362 15:43:31 event.event_scheduler -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:23.362 15:43:31 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:23.362 [2024-11-30 15:43:31.223020] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:23.362 [2024-11-30 15:43:31.223090] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1701175 ] 00:06:23.622 [2024-11-30 15:43:31.359651] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:06:23.622 [2024-11-30 15:43:31.390767] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:23.622 [2024-11-30 15:43:31.416476] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.622 [2024-11-30 15:43:31.416561] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:23.622 [2024-11-30 15:43:31.416643] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:06:23.622 [2024-11-30 15:43:31.416645] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:06:24.190 15:43:32 event.event_scheduler -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:24.190 15:43:32 event.event_scheduler -- common/autotest_common.sh@868 -- # return 0 00:06:24.190 15:43:32 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:24.190 15:43:32 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.190 15:43:32 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:24.190 [2024-11-30 15:43:32.089431] dpdk_governor.c: 178:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:06:24.190 [2024-11-30 15:43:32.089453] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:06:24.190 [2024-11-30 15:43:32.089464] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:24.190 [2024-11-30 15:43:32.089471] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:24.190 [2024-11-30 15:43:32.089478] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:24.190 15:43:32 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.190 15:43:32 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:24.190 15:43:32 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.190 15:43:32 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:24.449 [2024-11-30 15:43:32.158817] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
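The scheduler app above is launched with --wait-for-rpc, so subsystem initialization is deferred: the harness first selects the dynamic scheduler (falling back when the dpdk governor cannot initialize, as the *ERROR* line shows) and only then issues framework_start_init. A minimal sketch of the same sequence driven by hand, assuming the default RPC socket:

    scripts/rpc.py framework_set_scheduler dynamic   # the NOTICE lines above report load limit 20, core limit 80, core busy 95
    scripts/rpc.py framework_start_init              # completes the init deferred by --wait-for-rpc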
00:06:24.449 15:43:32 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.449 15:43:32 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:24.449 15:43:32 event.event_scheduler -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:24.449 15:43:32 event.event_scheduler -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:24.449 15:43:32 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:24.449 ************************************ 00:06:24.449 START TEST scheduler_create_thread 00:06:24.449 ************************************ 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1129 -- # scheduler_create_thread 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:24.449 2 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:24.449 3 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:24.449 4 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:24.449 5 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:24.449 6 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:24.449 7 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.449 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:24.449 8 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:24.450 9 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:24.450 10 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:24.450 15:43:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:25.388 15:43:33 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:25.388 15:43:33 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:25.388 15:43:33 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:25.388 15:43:33 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:26.765 15:43:34 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:26.765 15:43:34 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:26.765 15:43:34 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:26.765 15:43:34 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:26.765 15:43:34 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:27.702 15:43:35 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:27.702 00:06:27.702 real 0m3.372s 00:06:27.702 user 0m0.027s 00:06:27.702 sys 0m0.005s 00:06:27.702 15:43:35 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:27.702 15:43:35 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:27.702 ************************************ 00:06:27.702 END TEST scheduler_create_thread 00:06:27.702 ************************************ 00:06:27.702 15:43:35 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:27.702 15:43:35 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1701175 00:06:27.702 15:43:35 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' -z 1701175 ']' 00:06:27.702 15:43:35 event.event_scheduler -- common/autotest_common.sh@958 -- # kill -0 1701175 00:06:27.702 15:43:35 event.event_scheduler -- common/autotest_common.sh@959 -- # uname 00:06:27.702 15:43:35 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:27.702 15:43:35 event.event_scheduler -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1701175 00:06:27.961 15:43:35 event.event_scheduler -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:06:27.961 15:43:35 event.event_scheduler -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:06:27.961 15:43:35 event.event_scheduler -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1701175' 00:06:27.961 killing process with pid 1701175 00:06:27.961 15:43:35 event.event_scheduler -- common/autotest_common.sh@973 -- # kill 1701175 00:06:27.961 15:43:35 event.event_scheduler -- common/autotest_common.sh@978 -- # wait 1701175 00:06:28.221 [2024-11-30 15:43:35.953729] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
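scheduler_create_thread drives the scheduler test app purely through its plugin RPCs, as traced above: four fully active threads pinned to masks 0x1 through 0x8, four idle pinned counterparts, two unpinned threads at partial activity, then an activity change and a deletion. Condensed, with rpc_cmd apparently expanding to scripts/rpc.py --plugin scheduler_plugin in this harness:

    scheduler_thread_create -n active_pinned -m 0x1 -a 100   # repeated for -m 0x2, 0x4, 0x8
    scheduler_thread_create -n idle_pinned   -m 0x1 -a 0     # likewise for the four masks
    scheduler_thread_create -n one_third_active -a 30
    scheduler_thread_create -n half_active      -a 0
    scheduler_thread_set_active 11 50    # thread_id 11 came back from the half_active create
    scheduler_thread_create -n deleted  -a 100
    scheduler_thread_delete 12           # thread_id 12 came back from the 'deleted' create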
00:06:28.221 00:06:28.221 real 0m5.147s 00:06:28.221 user 0m10.362s 00:06:28.221 sys 0m0.480s 00:06:28.221 15:43:36 event.event_scheduler -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:28.221 15:43:36 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:28.221 ************************************ 00:06:28.221 END TEST event_scheduler 00:06:28.221 ************************************ 00:06:28.480 15:43:36 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:28.480 15:43:36 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:28.480 15:43:36 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:28.480 15:43:36 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:28.480 15:43:36 event -- common/autotest_common.sh@10 -- # set +x 00:06:28.480 ************************************ 00:06:28.480 START TEST app_repeat 00:06:28.480 ************************************ 00:06:28.480 15:43:36 event.app_repeat -- common/autotest_common.sh@1129 -- # app_repeat_test 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1702108 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1702108' 00:06:28.480 Process app_repeat pid: 1702108 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:28.480 spdk_app_start Round 0 00:06:28.480 15:43:36 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1702108 /var/tmp/spdk-nbd.sock 00:06:28.480 15:43:36 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1702108 ']' 00:06:28.480 15:43:36 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:28.480 15:43:36 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:28.480 15:43:36 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:28.480 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:28.480 15:43:36 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:28.480 15:43:36 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:28.480 [2024-11-30 15:43:36.270721] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:06:28.480 [2024-11-30 15:43:36.270801] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1702108 ] 00:06:28.480 [2024-11-30 15:43:36.410232] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:28.739 [2024-11-30 15:43:36.446017] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:28.739 [2024-11-30 15:43:36.469020] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:28.739 [2024-11-30 15:43:36.469022] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.306 15:43:37 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:29.306 15:43:37 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:06:29.306 15:43:37 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:29.565 Malloc0 00:06:29.565 15:43:37 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:29.565 Malloc1 00:06:29.824 15:43:37 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:29.824 15:43:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:29.824 /dev/nbd0 00:06:29.825 15:43:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:29.825 15:43:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 
00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:29.825 1+0 records in 00:06:29.825 1+0 records out 00:06:29.825 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000120707 s, 33.9 MB/s 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:29.825 15:43:37 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:06:30.084 15:43:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:30.084 15:43:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:30.084 15:43:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:30.084 /dev/nbd1 00:06:30.084 15:43:38 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:30.084 15:43:38 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:30.084 1+0 records in 00:06:30.084 1+0 records out 00:06:30.084 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000207639 s, 19.7 MB/s 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:30.084 15:43:38 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:30.084 15:43:38 event.app_repeat -- 
common/autotest_common.sh@893 -- # return 0 00:06:30.084 15:43:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:30.084 15:43:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:30.084 15:43:38 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:30.084 15:43:38 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.084 15:43:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:30.343 { 00:06:30.343 "nbd_device": "/dev/nbd0", 00:06:30.343 "bdev_name": "Malloc0" 00:06:30.343 }, 00:06:30.343 { 00:06:30.343 "nbd_device": "/dev/nbd1", 00:06:30.343 "bdev_name": "Malloc1" 00:06:30.343 } 00:06:30.343 ]' 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:30.343 { 00:06:30.343 "nbd_device": "/dev/nbd0", 00:06:30.343 "bdev_name": "Malloc0" 00:06:30.343 }, 00:06:30.343 { 00:06:30.343 "nbd_device": "/dev/nbd1", 00:06:30.343 "bdev_name": "Malloc1" 00:06:30.343 } 00:06:30.343 ]' 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:30.343 /dev/nbd1' 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:30.343 /dev/nbd1' 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:30.343 256+0 records in 00:06:30.343 256+0 records out 00:06:30.343 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0112081 s, 93.6 MB/s 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:30.343 256+0 records in 00:06:30.343 256+0 records out 00:06:30.343 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195007 s, 53.8 MB/s 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:30.343 15:43:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:30.602 256+0 records in 00:06:30.602 256+0 records out 00:06:30.602 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0215966 s, 48.6 MB/s 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:30.602 15:43:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:30.862 15:43:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:30.862 15:43:38 event.app_repeat -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:30.862 15:43:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:30.862 15:43:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:30.862 15:43:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:30.862 15:43:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:30.862 15:43:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:30.862 15:43:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:30.862 15:43:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:30.862 15:43:38 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.862 15:43:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:31.122 15:43:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:31.122 15:43:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:31.122 15:43:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:31.122 15:43:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:31.122 15:43:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:31.122 15:43:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:31.122 15:43:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:31.122 15:43:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:31.122 15:43:39 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:31.122 15:43:39 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:31.122 15:43:39 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:31.122 15:43:39 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:31.122 15:43:39 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:31.381 15:43:39 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:31.640 [2024-11-30 15:43:39.389800] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:31.640 [2024-11-30 15:43:39.410216] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:31.640 [2024-11-30 15:43:39.410218] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.640 [2024-11-30 15:43:39.449632] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:31.640 [2024-11-30 15:43:39.449680] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
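Each app_repeat round validates the two malloc bdevs over NBD with nothing more than dd and cmp, as the trace above shows: 1 MiB of random data is written through each device with O_DIRECT and compared back byte for byte. A standalone sketch of that pattern, assuming the devices are already mapped via nbd_start_disk:

    tmp=$(mktemp)                                    # scratch pattern file
    dd if=/dev/urandom of="$tmp" bs=4096 count=256   # 1 MiB of random data
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if="$tmp" of="$nbd" bs=4096 count=256 oflag=direct
        cmp -b -n 1M "$tmp" "$nbd"                   # non-zero exit on any mismatch
    done
    rm -f "$tmp"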
00:06:34.930 15:43:42 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:34.930 15:43:42 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:34.930 spdk_app_start Round 1 00:06:34.930 15:43:42 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1702108 /var/tmp/spdk-nbd.sock 00:06:34.930 15:43:42 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1702108 ']' 00:06:34.930 15:43:42 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:34.930 15:43:42 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:34.930 15:43:42 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:34.930 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:34.930 15:43:42 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:34.930 15:43:42 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:34.930 15:43:42 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:34.930 15:43:42 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:06:34.930 15:43:42 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:34.930 Malloc0 00:06:34.930 15:43:42 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:34.930 Malloc1 00:06:34.930 15:43:42 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:34.930 15:43:42 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:35.189 /dev/nbd0 00:06:35.189 15:43:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:35.189 15:43:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd0 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:35.189 1+0 records in 00:06:35.189 1+0 records out 00:06:35.189 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234974 s, 17.4 MB/s 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:35.189 15:43:43 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:06:35.189 15:43:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:35.189 15:43:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:35.189 15:43:43 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:35.448 /dev/nbd1 00:06:35.448 15:43:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:35.448 15:43:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:35.448 1+0 records in 00:06:35.448 1+0 records out 00:06:35.448 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252461 s, 16.2 MB/s 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:06:35.448 15:43:43 
event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:35.448 15:43:43 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:06:35.449 15:43:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:35.449 15:43:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:35.449 15:43:43 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:35.449 15:43:43 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.449 15:43:43 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:35.707 15:43:43 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:35.707 { 00:06:35.708 "nbd_device": "/dev/nbd0", 00:06:35.708 "bdev_name": "Malloc0" 00:06:35.708 }, 00:06:35.708 { 00:06:35.708 "nbd_device": "/dev/nbd1", 00:06:35.708 "bdev_name": "Malloc1" 00:06:35.708 } 00:06:35.708 ]' 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:35.708 { 00:06:35.708 "nbd_device": "/dev/nbd0", 00:06:35.708 "bdev_name": "Malloc0" 00:06:35.708 }, 00:06:35.708 { 00:06:35.708 "nbd_device": "/dev/nbd1", 00:06:35.708 "bdev_name": "Malloc1" 00:06:35.708 } 00:06:35.708 ]' 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:35.708 /dev/nbd1' 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:35.708 /dev/nbd1' 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:35.708 256+0 records in 00:06:35.708 256+0 records out 00:06:35.708 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110348 s, 95.0 MB/s 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:35.708 256+0 records in 00:06:35.708 256+0 records out 00:06:35.708 1048576 
bytes (1.0 MB, 1.0 MiB) copied, 0.0192967 s, 54.3 MB/s 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:35.708 256+0 records in 00:06:35.708 256+0 records out 00:06:35.708 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.021286 s, 49.3 MB/s 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:35.708 15:43:43 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:35.967 15:43:43 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:36.226 15:43:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:36.226 15:43:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:36.226 15:43:44 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:36.226 15:43:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:36.226 15:43:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:36.226 15:43:44 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:36.226 15:43:44 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:36.226 15:43:44 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:36.226 15:43:44 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:36.226 15:43:44 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:36.226 15:43:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:36.486 15:43:44 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:36.486 15:43:44 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:36.746 15:43:44 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:36.746 [2024-11-30 15:43:44.677240] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:36.746 [2024-11-30 15:43:44.696999] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:36.746 [2024-11-30 15:43:44.697000] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.006 [2024-11-30 15:43:44.737435] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:37.006 [2024-11-30 15:43:44.737480] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
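Each round's nbd bring-up above leans on the waitfornbd helper traced at autotest_common.sh@872-@893: poll /proc/partitions until the kernel publishes the device, then prove it is readable with one 4 KiB O_DIRECT read. A minimal bash sketch of that shape — the retry budget matches the trace, but the sleep interval and mktemp scratch path are illustrative assumptions (the suite reuses test/event/nbdtest):

    waitfornbd() {
        local nbd_name=$1 i tmp
        tmp=$(mktemp)    # assumption: scratch file; the suite uses a fixed path
        # Phase 1: wait (up to 20 probes) for the device to appear in /proc/partitions.
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1    # assumption: the trace does not show the poll interval
        done
        ((i <= 20)) || { rm -f "$tmp"; return 1; }
        # Phase 2: confirm a direct-I/O read actually returns data.
        for ((i = 1; i <= 20; i++)); do
            if dd if="/dev/$nbd_name" of="$tmp" bs=4096 count=1 iflag=direct 2>/dev/null &&
               [[ $(stat -c %s "$tmp") -ne 0 ]]; then
                rm -f "$tmp"
                return 0
            fi
            sleep 0.1
        done
        rm -f "$tmp"
        return 1
    }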
00:06:40.294 15:43:47 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:40.294 15:43:47 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:40.294 spdk_app_start Round 2 00:06:40.294 15:43:47 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1702108 /var/tmp/spdk-nbd.sock 00:06:40.294 15:43:47 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1702108 ']' 00:06:40.294 15:43:47 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:40.294 15:43:47 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:40.294 15:43:47 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:40.294 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:40.294 15:43:47 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:40.294 15:43:47 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:40.294 15:43:47 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:40.294 15:43:47 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:06:40.294 15:43:47 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:40.294 Malloc0 00:06:40.294 15:43:47 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:40.294 Malloc1 00:06:40.294 15:43:48 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:40.294 15:43:48 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:40.553 /dev/nbd0 00:06:40.553 15:43:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:40.553 15:43:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd0 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:40.553 1+0 records in 00:06:40.553 1+0 records out 00:06:40.553 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229533 s, 17.8 MB/s 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:40.553 15:43:48 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:06:40.553 15:43:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:40.553 15:43:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:40.553 15:43:48 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:40.812 /dev/nbd1 00:06:40.812 15:43:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:40.812 15:43:48 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:40.812 1+0 records in 00:06:40.812 1+0 records out 00:06:40.812 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000201472 s, 20.3 MB/s 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:06:40.812 15:43:48 
event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:40.812 15:43:48 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:06:40.812 15:43:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:40.812 15:43:48 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:40.812 15:43:48 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:40.812 15:43:48 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:40.812 15:43:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:41.072 { 00:06:41.072 "nbd_device": "/dev/nbd0", 00:06:41.072 "bdev_name": "Malloc0" 00:06:41.072 }, 00:06:41.072 { 00:06:41.072 "nbd_device": "/dev/nbd1", 00:06:41.072 "bdev_name": "Malloc1" 00:06:41.072 } 00:06:41.072 ]' 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:41.072 { 00:06:41.072 "nbd_device": "/dev/nbd0", 00:06:41.072 "bdev_name": "Malloc0" 00:06:41.072 }, 00:06:41.072 { 00:06:41.072 "nbd_device": "/dev/nbd1", 00:06:41.072 "bdev_name": "Malloc1" 00:06:41.072 } 00:06:41.072 ]' 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:41.072 /dev/nbd1' 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:41.072 /dev/nbd1' 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:41.072 256+0 records in 00:06:41.072 256+0 records out 00:06:41.072 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011013 s, 95.2 MB/s 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:41.072 256+0 records in 00:06:41.072 256+0 records out 00:06:41.072 1048576 bytes 
(1.0 MB, 1.0 MiB) copied, 0.0194913 s, 53.8 MB/s 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:41.072 15:43:48 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:41.072 256+0 records in 00:06:41.072 256+0 records out 00:06:41.072 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0213739 s, 49.1 MB/s 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:41.073 15:43:48 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:41.332 15:43:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:41.332 15:43:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:41.332 15:43:49 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:41.332 15:43:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:41.332 15:43:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:41.332 15:43:49 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:41.332 15:43:49 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:41.332 15:43:49 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:41.332 15:43:49 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:41.332 15:43:49 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:41.591 15:43:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:41.591 15:43:49 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:41.591 15:43:49 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:41.591 15:43:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:41.591 15:43:49 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:41.591 15:43:49 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:41.591 15:43:49 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:41.591 15:43:49 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:41.591 15:43:49 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:41.591 15:43:49 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.591 15:43:49 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:41.850 15:43:49 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:41.850 15:43:49 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:42.109 15:43:49 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:42.109 [2024-11-30 15:43:49.971839] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:42.109 [2024-11-30 15:43:49.991535] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:42.109 [2024-11-30 15:43:49.991537] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.109 [2024-11-30 15:43:50.031267] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:42.109 [2024-11-30 15:43:50.031312] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
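Both rounds drive the same 1 MiB write/verify cycle over the exported devices: seed 256 x 4 KiB of random data, copy it to every nbd with O_DIRECT, then byte-compare each device against the seed. A condensed sketch of that nbd_dd_data_verify flow under simplifying assumptions — the suite splits write and verify into separate passes selected by an operation argument, and the temp path here is abbreviated:

    nbd_dd_data_verify() {
        local nbd_list=("$@") tmp_file=/tmp/nbdrandtest  # suite uses test/event/nbdrandtest
        # Seed: 256 x 4 KiB = 1 MiB of random reference data.
        dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
        # Write pass: push the seed to every device, bypassing the page cache.
        for dev in "${nbd_list[@]}"; do
            dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
        done
        # Verify pass: any differing byte makes cmp (and so the test) fail.
        for dev in "${nbd_list[@]}"; do
            cmp -b -n 1M "$tmp_file" "$dev"
        done
        rm "$tmp_file"
    }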
00:06:45.398 15:43:52 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1702108 /var/tmp/spdk-nbd.sock 00:06:45.398 15:43:52 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 1702108 ']' 00:06:45.398 15:43:52 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:45.398 15:43:52 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:45.398 15:43:52 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:45.398 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:45.398 15:43:52 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:45.398 15:43:52 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:06:45.398 15:43:53 event.app_repeat -- event/event.sh@39 -- # killprocess 1702108 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@954 -- # '[' -z 1702108 ']' 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@958 -- # kill -0 1702108 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@959 -- # uname 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1702108 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1702108' 00:06:45.398 killing process with pid 1702108 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@973 -- # kill 1702108 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@978 -- # wait 1702108 00:06:45.398 spdk_app_start is called in Round 0. 00:06:45.398 Shutdown signal received, stop current app iteration 00:06:45.398 Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 reinitialization... 00:06:45.398 spdk_app_start is called in Round 1. 00:06:45.398 Shutdown signal received, stop current app iteration 00:06:45.398 Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 reinitialization... 00:06:45.398 spdk_app_start is called in Round 2. 00:06:45.398 Shutdown signal received, stop current app iteration 00:06:45.398 Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 reinitialization... 00:06:45.398 spdk_app_start is called in Round 3. 
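The killprocess helper traced at autotest_common.sh@954-@978 guards the SIGTERM with two checks: the pid must still be alive (kill -0), and its comm name must not be sudo, so the test never terminates a privilege wrapper instead of the reactor it launched. A sketch, assuming the target is a child of the calling shell (otherwise wait has nothing to reap):

    killprocess() {
        local pid=$1
        [[ -n $pid ]] || return 1
        kill -0 "$pid" 2>/dev/null || return 1      # still running?
        if [[ $(uname) == Linux ]]; then
            local process_name
            process_name=$(ps --no-headers -o comm= "$pid")
            [[ $process_name != sudo ]] || return 1  # never kill the sudo wrapper
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" 2>/dev/null || true              # reap; only works for children
    }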
00:06:45.398 Shutdown signal received, stop current app iteration 00:06:45.398 15:43:53 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:45.398 15:43:53 event.app_repeat -- event/event.sh@42 -- # return 0 00:06:45.398 00:06:45.398 real 0m16.977s 00:06:45.398 user 0m36.552s 00:06:45.398 sys 0m3.195s 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:45.398 15:43:53 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:45.398 ************************************ 00:06:45.398 END TEST app_repeat 00:06:45.398 ************************************ 00:06:45.398 15:43:53 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:45.398 15:43:53 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:45.398 15:43:53 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:45.398 15:43:53 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:45.398 15:43:53 event -- common/autotest_common.sh@10 -- # set +x 00:06:45.398 ************************************ 00:06:45.398 START TEST cpu_locks 00:06:45.398 ************************************ 00:06:45.398 15:43:53 event.cpu_locks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:45.657 * Looking for test storage... 00:06:45.657 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:45.657 15:43:53 event.cpu_locks -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:45.657 15:43:53 event.cpu_locks -- common/autotest_common.sh@1693 -- # lcov --version 00:06:45.657 15:43:53 event.cpu_locks -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:45.657 15:43:53 event.cpu_locks -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:45.657 15:43:53 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:45.657 15:43:53 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:45.657 15:43:53 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:45.657 15:43:53 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:06:45.657 15:43:53 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:06:45.657 15:43:53 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:45.658 15:43:53 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:06:45.658 15:43:53 event.cpu_locks -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:45.658 15:43:53 event.cpu_locks -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:45.658 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.658 --rc genhtml_branch_coverage=1 00:06:45.658 --rc genhtml_function_coverage=1 00:06:45.658 --rc genhtml_legend=1 00:06:45.658 --rc geninfo_all_blocks=1 00:06:45.658 --rc geninfo_unexecuted_blocks=1 00:06:45.658 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.658 ' 00:06:45.658 15:43:53 event.cpu_locks -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:45.658 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.658 --rc genhtml_branch_coverage=1 00:06:45.658 --rc genhtml_function_coverage=1 00:06:45.658 --rc genhtml_legend=1 00:06:45.658 --rc geninfo_all_blocks=1 00:06:45.658 --rc geninfo_unexecuted_blocks=1 00:06:45.658 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.658 ' 00:06:45.658 15:43:53 event.cpu_locks -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:45.658 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.658 --rc genhtml_branch_coverage=1 00:06:45.658 --rc genhtml_function_coverage=1 00:06:45.658 --rc genhtml_legend=1 00:06:45.658 --rc geninfo_all_blocks=1 00:06:45.658 --rc geninfo_unexecuted_blocks=1 00:06:45.658 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.658 ' 00:06:45.658 15:43:53 event.cpu_locks -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:45.658 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.658 --rc genhtml_branch_coverage=1 00:06:45.658 --rc genhtml_function_coverage=1 00:06:45.658 --rc genhtml_legend=1 00:06:45.658 --rc geninfo_all_blocks=1 00:06:45.658 --rc geninfo_unexecuted_blocks=1 00:06:45.658 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.658 ' 00:06:45.658 15:43:53 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:45.658 15:43:53 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:45.658 15:43:53 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:45.658 15:43:53 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:45.658 15:43:53 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:45.658 15:43:53 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:45.658 15:43:53 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:45.658 ************************************ 00:06:45.658 START TEST default_locks 00:06:45.658 ************************************ 00:06:45.658 15:43:53 event.cpu_locks.default_locks -- common/autotest_common.sh@1129 -- # default_locks 00:06:45.658 15:43:53 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1705280 00:06:45.658 15:43:53 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 1705280 00:06:45.658 15:43:53 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 1705280 ']' 00:06:45.658 15:43:53 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:45.658 15:43:53 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:45.658 15:43:53 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:45.658 15:43:53 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:45.658 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:45.658 15:43:53 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:45.658 15:43:53 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:06:45.658 [2024-11-30 15:43:53.539097] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:45.658 [2024-11-30 15:43:53.539153] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1705280 ] 00:06:45.917 [2024-11-30 15:43:53.674398] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:06:45.917 [2024-11-30 15:43:53.708887] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.917 [2024-11-30 15:43:53.731398] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.482 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:46.482 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 0 00:06:46.482 15:43:54 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 1705280 00:06:46.482 15:43:54 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 1705280 00:06:46.482 15:43:54 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:47.047 lslocks: write error 00:06:47.047 15:43:54 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 1705280 00:06:47.047 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # '[' -z 1705280 ']' 00:06:47.047 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@958 -- # kill -0 1705280 00:06:47.047 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # uname 00:06:47.047 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:47.047 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1705280 00:06:47.047 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:47.047 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:47.047 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1705280' 00:06:47.047 killing process with pid 1705280 00:06:47.047 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@973 -- # kill 1705280 00:06:47.047 15:43:54 event.cpu_locks.default_locks -- common/autotest_common.sh@978 -- # wait 1705280 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1705280 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # local es=0 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 1705280 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # waitforlisten 1705280 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 1705280 ']' 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:47.615 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
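The "lslocks: write error" above is expected noise rather than a failure: locks_exist pipes lslocks into grep -q, and grep -q closes the pipe as soon as it sees the first spdk_cpu_lock entry, so lslocks dies on EPIPE mid-write. The check itself reduces to:

    locks_exist() {
        local pid=$1
        # grep -q exits on first match; lslocks then hits a broken pipe and
        # prints "lslocks: write error", which the suite tolerates.
        lslocks -p "$pid" | grep -q spdk_cpu_lock
    }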
00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:06:47.615 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (1705280) - No such process 00:06:47.615 ERROR: process (pid: 1705280) is no longer running 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 1 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # es=1 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:47.615 00:06:47.615 real 0m1.757s 00:06:47.615 user 0m1.773s 00:06:47.615 sys 0m0.613s 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:47.615 15:43:55 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:06:47.615 ************************************ 00:06:47.615 END TEST default_locks 00:06:47.615 ************************************ 00:06:47.615 15:43:55 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:47.615 15:43:55 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.615 15:43:55 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.615 15:43:55 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:47.615 ************************************ 00:06:47.615 START TEST default_locks_via_rpc 00:06:47.615 ************************************ 00:06:47.615 15:43:55 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1129 -- # default_locks_via_rpc 00:06:47.615 15:43:55 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1705590 00:06:47.615 15:43:55 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 1705590 00:06:47.615 15:43:55 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1705590 ']' 00:06:47.615 15:43:55 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:47.615 15:43:55 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:47.615 15:43:55 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:47.615 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
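The tail of default_locks above asserts the negative case: once the daemon holding the lock is gone, waitforlisten on its pid must fail ("No such process"), and the NOT wrapper turns that expected failure into a pass. A simplified sketch of the inversion — the real helper also validates the wrapped command and handles xtrace patterns, and the signal-folding line is an assumption:

    NOT() {
        local es=0
        "$@" || es=$?            # run the command, capture its exit status
        if (( es > 128 )); then
            es=$(( es & 127 ))   # assumption: fold signal deaths to the signal number
        fi
        (( es != 0 ))            # succeed only if the wrapped command failed
    }

So "NOT waitforlisten 1705280" passes precisely because pid 1705280 no longer exists.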
00:06:47.615 15:43:55 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:47.615 15:43:55 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:47.615 15:43:55 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:47.615 [2024-11-30 15:43:55.374170] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:47.615 [2024-11-30 15:43:55.374249] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1705590 ] 00:06:47.615 [2024-11-30 15:43:55.510316] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:47.615 [2024-11-30 15:43:55.546152] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.615 [2024-11-30 15:43:55.568318] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 1705590 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 1705590 00:06:48.551 15:43:56 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:48.809 15:43:56 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 1705590 00:06:48.809 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # '[' -z 1705590 ']' 00:06:48.809 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@958 -- # kill -0 1705590 00:06:48.809 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # uname 00:06:48.809 15:43:56 
event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:48.809 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1705590 00:06:49.068 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:49.068 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:49.068 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1705590' 00:06:49.068 killing process with pid 1705590 00:06:49.069 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@973 -- # kill 1705590 00:06:49.069 15:43:56 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@978 -- # wait 1705590 00:06:49.328 00:06:49.328 real 0m1.731s 00:06:49.328 user 0m1.741s 00:06:49.328 sys 0m0.596s 00:06:49.328 15:43:57 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:49.328 15:43:57 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:49.328 ************************************ 00:06:49.328 END TEST default_locks_via_rpc 00:06:49.328 ************************************ 00:06:49.328 15:43:57 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:49.328 15:43:57 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:49.328 15:43:57 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:49.328 15:43:57 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:49.328 ************************************ 00:06:49.328 START TEST non_locking_app_on_locked_coremask 00:06:49.328 ************************************ 00:06:49.328 15:43:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # non_locking_app_on_locked_coremask 00:06:49.328 15:43:57 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1705895 00:06:49.328 15:43:57 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 1705895 /var/tmp/spdk.sock 00:06:49.328 15:43:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1705895 ']' 00:06:49.328 15:43:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:49.328 15:43:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:49.328 15:43:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:49.328 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
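The default_locks_via_rpc pass that just ended drives the same core lock through the RPC plane instead of process lifetime: release the cpumask locks at runtime, confirm lslocks shows none, re-acquire them, confirm the lock is back. Against a target listening on the default /var/tmp/spdk.sock, the two calls reduce to:

    # release the per-core lock files held by the running target
    scripts/rpc.py framework_disable_cpumask_locks
    # re-acquire them without restarting the target
    scripts/rpc.py framework_enable_cpumask_locks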
00:06:49.328 15:43:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:49.328 15:43:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:49.328 15:43:57 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:49.328 [2024-11-30 15:43:57.178147] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:49.328 [2024-11-30 15:43:57.178211] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1705895 ] 00:06:49.593 [2024-11-30 15:43:57.315025] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:49.593 [2024-11-30 15:43:57.349346] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.593 [2024-11-30 15:43:57.371927] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.161 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:50.161 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:06:50.161 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1706146 00:06:50.161 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 1706146 /var/tmp/spdk2.sock 00:06:50.161 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1706146 ']' 00:06:50.161 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:50.161 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:50.161 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:50.161 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:50.161 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:50.161 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:50.161 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:50.161 [2024-11-30 15:43:58.045240] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:50.161 [2024-11-30 15:43:58.045302] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1706146 ] 00:06:50.420 [2024-11-30 15:43:58.183084] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
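non_locking_app_on_locked_coremask needs two targets alive on the same core: the first claims the core-0 lock normally, and the second opts out of locking so the claim does not collide. Per the parameters in the trace (paths abbreviated), the launch pair is:

    build/bin/spdk_tgt -m 0x1 &                      # holds the core 0 lock
    build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks \
        -r /var/tmp/spdk2.sock &                     # no lock, separate RPC socket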
00:06:50.420 [2024-11-30 15:43:58.241196] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:50.420 [2024-11-30 15:43:58.241218] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.420 [2024-11-30 15:43:58.287157] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.988 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:50.988 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:06:50.988 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 1705895 00:06:50.988 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1705895 00:06:50.988 15:43:58 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:52.368 lslocks: write error 00:06:52.368 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 1705895 00:06:52.368 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1705895 ']' 00:06:52.368 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 1705895 00:06:52.368 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:06:52.368 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:52.368 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1705895 00:06:52.368 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:52.368 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:52.368 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1705895' 00:06:52.368 killing process with pid 1705895 00:06:52.368 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 1705895 00:06:52.368 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 1705895 00:06:53.030 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 1706146 00:06:53.030 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1706146 ']' 00:06:53.030 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 1706146 00:06:53.030 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:06:53.030 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:53.030 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1706146 00:06:53.030 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:53.030 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:53.030 15:44:00 
event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1706146' 00:06:53.030 killing process with pid 1706146 00:06:53.030 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 1706146 00:06:53.030 15:44:00 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 1706146 00:06:53.290 00:06:53.290 real 0m3.968s 00:06:53.290 user 0m4.170s 00:06:53.290 sys 0m1.320s 00:06:53.290 15:44:01 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:53.290 15:44:01 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:53.290 ************************************ 00:06:53.290 END TEST non_locking_app_on_locked_coremask 00:06:53.290 ************************************ 00:06:53.290 15:44:01 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:06:53.290 15:44:01 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:53.290 15:44:01 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:53.290 15:44:01 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:53.290 ************************************ 00:06:53.290 START TEST locking_app_on_unlocked_coremask 00:06:53.290 ************************************ 00:06:53.290 15:44:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_unlocked_coremask 00:06:53.290 15:44:01 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1706718 00:06:53.290 15:44:01 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 1706718 /var/tmp/spdk.sock 00:06:53.290 15:44:01 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:53.290 15:44:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1706718 ']' 00:06:53.290 15:44:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:53.290 15:44:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:53.290 15:44:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:53.290 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:53.290 15:44:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:53.290 15:44:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:53.290 [2024-11-30 15:44:01.222711] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:06:53.290 [2024-11-30 15:44:01.222781] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1706718 ] 00:06:53.550 [2024-11-30 15:44:01.357571] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:53.550 [2024-11-30 15:44:01.392566] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:53.550 [2024-11-30 15:44:01.392588] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.550 [2024-11-30 15:44:01.413900] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.118 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:54.118 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:06:54.118 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1706892 00:06:54.118 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 1706892 /var/tmp/spdk2.sock 00:06:54.118 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:54.118 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1706892 ']' 00:06:54.118 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:54.118 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:54.118 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:54.118 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:54.118 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:54.118 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:54.377 [2024-11-30 15:44:02.103406] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:54.377 [2024-11-30 15:44:02.103472] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1706892 ] 00:06:54.377 [2024-11-30 15:44:02.238974] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
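The locks_exist check traced just below can be reconstructed from this output; a minimal sketch (lslocks comes from util-linux, and spdk_cpu_lock matches the per-core lock files under /var/tmp seen later in this log):

  locks_exist() {
    # true when the given pid holds an SPDK per-core lock file
    lslocks -p "$1" | grep -q spdk_cpu_lock
  }
  locks_exist 1706892   # the target started without --disable-cpumask-locks holds the lock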
00:06:54.377 [2024-11-30 15:44:02.302325] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.636 [2024-11-30 15:44:02.344619] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.206 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:55.206 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:06:55.206 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 1706892 00:06:55.206 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1706892 00:06:55.206 15:44:02 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:55.774 lslocks: write error 00:06:55.774 15:44:03 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 1706718 00:06:55.774 15:44:03 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1706718 ']' 00:06:55.774 15:44:03 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 1706718 00:06:55.774 15:44:03 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:06:55.774 15:44:03 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:55.774 15:44:03 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1706718 00:06:55.774 15:44:03 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:55.774 15:44:03 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:55.774 15:44:03 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1706718' 00:06:55.774 killing process with pid 1706718 00:06:55.774 15:44:03 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 1706718 00:06:55.774 15:44:03 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 1706718 00:06:56.344 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 1706892 00:06:56.344 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1706892 ']' 00:06:56.344 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 1706892 00:06:56.344 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:06:56.344 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:56.344 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1706892 00:06:56.604 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:56.604 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:56.604 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1706892' 00:06:56.604 killing process with pid 1706892 00:06:56.604 
15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 1706892 00:06:56.604 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 1706892 00:06:56.864 00:06:56.864 real 0m3.403s 00:06:56.864 user 0m3.594s 00:06:56.864 sys 0m1.112s 00:06:56.864 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:56.864 15:44:04 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:56.864 ************************************ 00:06:56.864 END TEST locking_app_on_unlocked_coremask 00:06:56.864 ************************************ 00:06:56.864 15:44:04 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:56.864 15:44:04 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:56.864 15:44:04 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:56.864 15:44:04 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:56.864 ************************************ 00:06:56.864 START TEST locking_app_on_locked_coremask 00:06:56.864 ************************************ 00:06:56.864 15:44:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_locked_coremask 00:06:56.864 15:44:04 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1707303 00:06:56.864 15:44:04 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 1707303 /var/tmp/spdk.sock 00:06:56.864 15:44:04 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:56.865 15:44:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1707303 ']' 00:06:56.865 15:44:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:56.865 15:44:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:56.865 15:44:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:56.865 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:56.865 15:44:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:56.865 15:44:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:56.865 [2024-11-30 15:44:04.708006] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:56.865 [2024-11-30 15:44:04.708067] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1707303 ] 00:06:57.124 [2024-11-30 15:44:04.843101] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:06:57.124 [2024-11-30 15:44:04.878732] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.124 [2024-11-30 15:44:04.899002] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1707567 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1707567 /var/tmp/spdk2.sock 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # local es=0 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 1707567 /var/tmp/spdk2.sock 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # waitforlisten 1707567 /var/tmp/spdk2.sock 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 1707567 ']' 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:57.691 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:57.692 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:57.692 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:57.692 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:57.692 15:44:05 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:57.692 [2024-11-30 15:44:05.586582] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:57.692 [2024-11-30 15:44:05.586662] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1707567 ] 00:06:57.950 [2024-11-30 15:44:05.720879] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
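The NOT wrapper around the second waitforlisten above marks that launch as expected to fail; a minimal out-of-harness reproduction (binary path shortened, the pid in the message will differ):

  build/bin/spdk_tgt -m 0x1 &                        # first instance locks core 0
  build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock   # second instance aborts:
  #   app.c: *ERROR*: Cannot create lock on core 0, probably process <pid> has claimed it.
  #   app.c: *ERROR*: Unable to acquire lock on assigned core mask - exiting.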
00:06:57.951 [2024-11-30 15:44:05.779269] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1707303 has claimed it. 00:06:57.951 [2024-11-30 15:44:05.779303] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:58.519 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (1707567) - No such process 00:06:58.519 ERROR: process (pid: 1707567) is no longer running 00:06:58.519 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:58.519 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 1 00:06:58.519 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # es=1 00:06:58.519 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:58.519 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:58.519 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:58.519 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 1707303 00:06:58.519 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1707303 00:06:58.519 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:58.779 lslocks: write error 00:06:58.779 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 1707303 00:06:58.779 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 1707303 ']' 00:06:58.779 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 1707303 00:06:58.779 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:06:58.779 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:58.779 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1707303 00:06:58.779 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:58.779 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:58.779 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1707303' 00:06:58.779 killing process with pid 1707303 00:06:58.779 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 1707303 00:06:58.779 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 1707303 00:06:59.038 00:06:59.038 real 0m2.252s 00:06:59.038 user 0m2.396s 00:06:59.038 sys 0m0.664s 00:06:59.038 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:59.038 15:44:06 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:59.038 ************************************ 00:06:59.038 END TEST locking_app_on_locked_coremask 00:06:59.038 ************************************ 00:06:59.038 15:44:06 
event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:59.038 15:44:06 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:59.038 15:44:06 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:59.038 15:44:06 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:59.298 ************************************ 00:06:59.298 START TEST locking_overlapped_coremask 00:06:59.298 ************************************ 00:06:59.298 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask 00:06:59.298 15:44:07 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1707859 00:06:59.298 15:44:07 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 1707859 /var/tmp/spdk.sock 00:06:59.298 15:44:07 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:59.298 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 1707859 ']' 00:06:59.298 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:59.298 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:59.298 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:59.298 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:59.298 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:59.298 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:59.298 [2024-11-30 15:44:07.042183] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:59.298 [2024-11-30 15:44:07.042256] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1707859 ] 00:06:59.298 [2024-11-30 15:44:07.177957] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
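The first target here runs on -m 0x7 and the second, started below, on -m 0x1c; the collision point falls out of the mask arithmetic (a quick check, not part of the harness):

  # 0x7  = 0b00111 -> cores 0,1,2
  # 0x1c = 0b11100 -> cores 2,3,4
  printf 'overlap: 0x%x\n' $(( 0x7 & 0x1c ))   # overlap: 0x4, i.e. core 2

which is why the claim error below names core 2.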
00:06:59.298 [2024-11-30 15:44:07.212521] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:59.298 [2024-11-30 15:44:07.235164] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:59.298 [2024-11-30 15:44:07.235182] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:06:59.298 [2024-11-30 15:44:07.235184] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1707881 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1707881 /var/tmp/spdk2.sock 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 1707881 /var/tmp/spdk2.sock 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # waitforlisten 1707881 /var/tmp/spdk2.sock 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 1707881 ']' 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:00.236 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:00.236 15:44:07 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:00.236 [2024-11-30 15:44:07.915075] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:00.236 [2024-11-30 15:44:07.915142] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1707881 ] 00:07:00.236 [2024-11-30 15:44:08.054216] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. 
Enabled only for validation. 00:07:00.236 [2024-11-30 15:44:08.116953] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1707859 has claimed it. 00:07:00.236 [2024-11-30 15:44:08.116987] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:00.804 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (1707881) - No such process 00:07:00.804 ERROR: process (pid: 1707881) is no longer running 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 1707859 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # '[' -z 1707859 ']' 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@958 -- # kill -0 1707859 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # uname 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1707859 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1707859' 00:07:00.804 killing process with pid 1707859 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@973 -- # kill 1707859 00:07:00.804 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@978 -- # wait 1707859 00:07:01.064 00:07:01.064 real 0m1.918s 00:07:01.064 user 0m5.306s 00:07:01.064 sys 0m0.463s 00:07:01.064 15:44:08 event.cpu_locks.locking_overlapped_coremask -- 
common/autotest_common.sh@1130 -- # xtrace_disable 00:07:01.064 15:44:08 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:01.064 ************************************ 00:07:01.064 END TEST locking_overlapped_coremask 00:07:01.064 ************************************ 00:07:01.064 15:44:08 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:01.064 15:44:08 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:01.064 15:44:08 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:01.064 15:44:08 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:01.064 ************************************ 00:07:01.064 START TEST locking_overlapped_coremask_via_rpc 00:07:01.064 ************************************ 00:07:01.064 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask_via_rpc 00:07:01.064 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1708171 00:07:01.064 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 1708171 /var/tmp/spdk.sock 00:07:01.064 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1708171 ']' 00:07:01.064 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:01.064 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:01.064 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:01.064 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:01.064 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:01.064 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:01.064 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:01.324 [2024-11-30 15:44:09.035763] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:01.324 [2024-11-30 15:44:09.035819] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1708171 ] 00:07:01.324 [2024-11-30 15:44:09.171515] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:01.324 [2024-11-30 15:44:09.206747] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
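The check_remaining_locks step traced in the teardown above compares the lock files actually present under /var/tmp with the set a three-core mask should leave behind; pieced together from the trace (sketch only):

  check_remaining_locks() {
    locks=(/var/tmp/spdk_cpu_lock_*)
    locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})   # cores 0..2 for -m 0x7
    [[ ${locks[*]} == "${locks_expected[*]}" ]]
  }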
00:07:01.324 [2024-11-30 15:44:09.206769] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:01.324 [2024-11-30 15:44:09.231824] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:01.324 [2024-11-30 15:44:09.231919] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:01.324 [2024-11-30 15:44:09.231922] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.263 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:02.263 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:02.263 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1708358 00:07:02.263 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 1708358 /var/tmp/spdk2.sock 00:07:02.263 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:02.263 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1708358 ']' 00:07:02.263 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:02.263 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:02.263 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:02.263 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:02.263 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:02.263 15:44:09 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:02.263 [2024-11-30 15:44:09.910429] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:02.263 [2024-11-30 15:44:09.910505] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1708358 ] 00:07:02.263 [2024-11-30 15:44:10.048477] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:02.263 [2024-11-30 15:44:10.111608] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:02.263 [2024-11-30 15:44:10.111644] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:02.263 [2024-11-30 15:44:10.161339] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:02.263 [2024-11-30 15:44:10.164646] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:02.263 [2024-11-30 15:44:10.164648] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 4 00:07:02.832 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:02.832 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:02.832 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:02.832 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:02.832 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:02.832 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:02.832 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:02.832 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # local es=0 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:03.092 [2024-11-30 15:44:10.807660] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1708171 has claimed it. 
00:07:03.092 request: 00:07:03.092 { 00:07:03.092 "method": "framework_enable_cpumask_locks", 00:07:03.092 "req_id": 1 00:07:03.092 } 00:07:03.092 Got JSON-RPC error response 00:07:03.092 response: 00:07:03.092 { 00:07:03.092 "code": -32603, 00:07:03.092 "message": "Failed to claim CPU core: 2" 00:07:03.092 } 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # es=1 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 1708171 /var/tmp/spdk.sock 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1708171 ']' 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:03.092 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:03.092 15:44:10 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:03.092 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:03.092 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:03.092 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 1708358 /var/tmp/spdk2.sock 00:07:03.092 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 1708358 ']' 00:07:03.092 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:03.092 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:03.092 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:03.092 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
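The request/response pair above shows the harness driving framework_enable_cpumask_locks over the second target's socket; -32603 is the generic JSON-RPC internal-error code, here carrying the core-2 claim failure. The same exchange can be issued by hand with SPDK's rpc.py (a sketch; socket paths as in this run):

  scripts/rpc.py framework_enable_cpumask_locks                          # first target: succeeds, locks cores 0-2
  scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks   # second target: fails, core 2 already locked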
00:07:03.092 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:03.092 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:03.352 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:03.352 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:03.352 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:03.352 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:03.352 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:03.352 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:03.352 00:07:03.352 real 0m2.206s 00:07:03.352 user 0m0.946s 00:07:03.352 sys 0m0.194s 00:07:03.352 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:03.352 15:44:11 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:03.352 ************************************ 00:07:03.352 END TEST locking_overlapped_coremask_via_rpc 00:07:03.352 ************************************ 00:07:03.352 15:44:11 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:03.352 15:44:11 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 1708171 ]] 00:07:03.352 15:44:11 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 1708171 00:07:03.353 15:44:11 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1708171 ']' 00:07:03.353 15:44:11 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1708171 00:07:03.353 15:44:11 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:03.353 15:44:11 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:03.353 15:44:11 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1708171 00:07:03.612 15:44:11 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:03.612 15:44:11 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:03.612 15:44:11 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1708171' 00:07:03.612 killing process with pid 1708171 00:07:03.612 15:44:11 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 1708171 00:07:03.612 15:44:11 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 1708171 00:07:03.871 15:44:11 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 1708358 ]] 00:07:03.871 15:44:11 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 1708358 00:07:03.871 15:44:11 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1708358 ']' 00:07:03.871 15:44:11 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1708358 00:07:03.871 15:44:11 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:03.871 15:44:11 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' 
Linux = Linux ']' 00:07:03.871 15:44:11 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1708358 00:07:03.871 15:44:11 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:03.871 15:44:11 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:03.871 15:44:11 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1708358' 00:07:03.871 killing process with pid 1708358 00:07:03.871 15:44:11 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 1708358 00:07:03.871 15:44:11 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 1708358 00:07:04.131 15:44:11 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:04.131 15:44:11 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:04.131 15:44:11 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 1708171 ]] 00:07:04.131 15:44:11 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 1708171 00:07:04.131 15:44:11 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1708171 ']' 00:07:04.131 15:44:11 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1708171 00:07:04.131 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (1708171) - No such process 00:07:04.131 15:44:11 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 1708171 is not found' 00:07:04.131 Process with pid 1708171 is not found 00:07:04.131 15:44:11 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 1708358 ]] 00:07:04.131 15:44:11 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 1708358 00:07:04.131 15:44:11 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 1708358 ']' 00:07:04.131 15:44:11 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 1708358 00:07:04.131 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (1708358) - No such process 00:07:04.131 15:44:11 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 1708358 is not found' 00:07:04.131 Process with pid 1708358 is not found 00:07:04.131 15:44:11 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:04.131 00:07:04.131 real 0m18.690s 00:07:04.131 user 0m30.954s 00:07:04.131 sys 0m6.015s 00:07:04.131 15:44:11 event.cpu_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:04.131 15:44:11 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:04.131 ************************************ 00:07:04.131 END TEST cpu_locks 00:07:04.131 ************************************ 00:07:04.131 00:07:04.131 real 0m45.241s 00:07:04.131 user 1m24.314s 00:07:04.131 sys 0m10.406s 00:07:04.131 15:44:12 event -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:04.131 15:44:12 event -- common/autotest_common.sh@10 -- # set +x 00:07:04.131 ************************************ 00:07:04.131 END TEST event 00:07:04.131 ************************************ 00:07:04.131 15:44:12 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:04.131 15:44:12 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:04.131 15:44:12 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:04.131 15:44:12 -- common/autotest_common.sh@10 -- # set +x 00:07:04.391 ************************************ 00:07:04.391 START TEST thread 00:07:04.391 ************************************ 00:07:04.391 15:44:12 thread -- 
common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:04.391 * Looking for test storage... 00:07:04.391 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@1693 -- # lcov --version 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:04.391 15:44:12 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:04.391 15:44:12 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:04.391 15:44:12 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:04.391 15:44:12 thread -- scripts/common.sh@336 -- # IFS=.-: 00:07:04.391 15:44:12 thread -- scripts/common.sh@336 -- # read -ra ver1 00:07:04.391 15:44:12 thread -- scripts/common.sh@337 -- # IFS=.-: 00:07:04.391 15:44:12 thread -- scripts/common.sh@337 -- # read -ra ver2 00:07:04.391 15:44:12 thread -- scripts/common.sh@338 -- # local 'op=<' 00:07:04.391 15:44:12 thread -- scripts/common.sh@340 -- # ver1_l=2 00:07:04.391 15:44:12 thread -- scripts/common.sh@341 -- # ver2_l=1 00:07:04.391 15:44:12 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:04.391 15:44:12 thread -- scripts/common.sh@344 -- # case "$op" in 00:07:04.391 15:44:12 thread -- scripts/common.sh@345 -- # : 1 00:07:04.391 15:44:12 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:04.391 15:44:12 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:04.391 15:44:12 thread -- scripts/common.sh@365 -- # decimal 1 00:07:04.391 15:44:12 thread -- scripts/common.sh@353 -- # local d=1 00:07:04.391 15:44:12 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:04.391 15:44:12 thread -- scripts/common.sh@355 -- # echo 1 00:07:04.391 15:44:12 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:07:04.391 15:44:12 thread -- scripts/common.sh@366 -- # decimal 2 00:07:04.391 15:44:12 thread -- scripts/common.sh@353 -- # local d=2 00:07:04.391 15:44:12 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:04.391 15:44:12 thread -- scripts/common.sh@355 -- # echo 2 00:07:04.391 15:44:12 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:07:04.391 15:44:12 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:04.391 15:44:12 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:04.391 15:44:12 thread -- scripts/common.sh@368 -- # return 0 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:04.391 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.391 --rc genhtml_branch_coverage=1 00:07:04.391 --rc genhtml_function_coverage=1 00:07:04.391 --rc genhtml_legend=1 00:07:04.391 --rc geninfo_all_blocks=1 00:07:04.391 --rc geninfo_unexecuted_blocks=1 00:07:04.391 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.391 ' 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:04.391 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.391 --rc genhtml_branch_coverage=1 00:07:04.391 --rc genhtml_function_coverage=1 00:07:04.391 --rc genhtml_legend=1 
00:07:04.391 --rc geninfo_all_blocks=1 00:07:04.391 --rc geninfo_unexecuted_blocks=1 00:07:04.391 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.391 ' 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:04.391 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.391 --rc genhtml_branch_coverage=1 00:07:04.391 --rc genhtml_function_coverage=1 00:07:04.391 --rc genhtml_legend=1 00:07:04.391 --rc geninfo_all_blocks=1 00:07:04.391 --rc geninfo_unexecuted_blocks=1 00:07:04.391 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.391 ' 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:04.391 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.391 --rc genhtml_branch_coverage=1 00:07:04.391 --rc genhtml_function_coverage=1 00:07:04.391 --rc genhtml_legend=1 00:07:04.391 --rc geninfo_all_blocks=1 00:07:04.391 --rc geninfo_unexecuted_blocks=1 00:07:04.391 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.391 ' 00:07:04.391 15:44:12 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:04.391 15:44:12 thread -- common/autotest_common.sh@10 -- # set +x 00:07:04.391 ************************************ 00:07:04.391 START TEST thread_poller_perf 00:07:04.391 ************************************ 00:07:04.391 15:44:12 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:04.391 [2024-11-30 15:44:12.355559] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:04.391 [2024-11-30 15:44:12.355648] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1708816 ] 00:07:04.649 [2024-11-30 15:44:12.495532] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:04.649 [2024-11-30 15:44:12.528195] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.649 [2024-11-30 15:44:12.552641] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.649 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:06.026 [2024-11-30T14:44:13.990Z] ====================================== 00:07:06.026 [2024-11-30T14:44:13.990Z] busy:2499195498 (cyc) 00:07:06.026 [2024-11-30T14:44:13.990Z] total_run_count: 850000 00:07:06.026 [2024-11-30T14:44:13.990Z] tsc_hz: 2494100000 (cyc) 00:07:06.026 [2024-11-30T14:44:13.990Z] ====================================== 00:07:06.026 [2024-11-30T14:44:13.990Z] poller_cost: 2940 (cyc), 1178 (nsec) 00:07:06.026 00:07:06.026 real 0m1.248s 00:07:06.026 user 0m1.062s 00:07:06.026 sys 0m0.082s 00:07:06.026 15:44:13 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:06.026 15:44:13 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:06.026 ************************************ 00:07:06.026 END TEST thread_poller_perf 00:07:06.026 ************************************ 00:07:06.026 15:44:13 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:06.026 15:44:13 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:06.026 15:44:13 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:06.026 15:44:13 thread -- common/autotest_common.sh@10 -- # set +x 00:07:06.026 ************************************ 00:07:06.026 START TEST thread_poller_perf 00:07:06.026 ************************************ 00:07:06.026 15:44:13 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:06.026 [2024-11-30 15:44:13.684349] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:06.026 [2024-11-30 15:44:13.684437] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1709099 ] 00:07:06.026 [2024-11-30 15:44:13.823156] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:06.026 [2024-11-30 15:44:13.858767] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.026 [2024-11-30 15:44:13.882496] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.026 Running 1000 pollers for 1 seconds with 0 microseconds period. 
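The poller_cost figure above follows from the two counters reported with it: busy cycles divided by run count, converted to nanoseconds with the printed tsc_hz (arithmetic check only, not harness code):

  echo $(( 2499195498 / 850000 ))   # -> 2940 cycles per poll
  # 2940 cyc / 2.4941 GHz ~= 1178 ns, matching the reported poller_cost

The same formula covers the 0-microsecond-period run that follows (186 cyc, 74 nsec).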
00:07:06.962 [2024-11-30T14:44:14.926Z] ====================================== 00:07:06.962 [2024-11-30T14:44:14.926Z] busy:2495682494 (cyc) 00:07:06.962 [2024-11-30T14:44:14.926Z] total_run_count: 13398000 00:07:06.962 [2024-11-30T14:44:14.926Z] tsc_hz: 2494100000 (cyc) 00:07:06.962 [2024-11-30T14:44:14.926Z] ====================================== 00:07:06.962 [2024-11-30T14:44:14.926Z] poller_cost: 186 (cyc), 74 (nsec) 00:07:06.962 00:07:06.962 real 0m1.243s 00:07:06.962 user 0m1.063s 00:07:06.962 sys 0m0.075s 00:07:06.962 15:44:14 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:06.962 15:44:14 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:06.962 ************************************ 00:07:06.962 END TEST thread_poller_perf 00:07:06.962 ************************************ 00:07:07.227 15:44:14 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:07.227 15:44:14 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:07.227 15:44:14 thread -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:07.227 15:44:14 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:07.227 15:44:14 thread -- common/autotest_common.sh@10 -- # set +x 00:07:07.227 ************************************ 00:07:07.227 START TEST thread_spdk_lock 00:07:07.227 ************************************ 00:07:07.227 15:44:14 thread.thread_spdk_lock -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:07.227 [2024-11-30 15:44:15.013905] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:07.227 [2024-11-30 15:44:15.014024] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1709385 ] 00:07:07.227 [2024-11-30 15:44:15.156040] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
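The *ERROR* spinlock lines that follow are expected output: spdk_lock deliberately violates the spin-lock rules to exercise the detection paths, and the pass criterion is the assertion summary at the end. The binary can be run standalone (sketch):

  ./test/thread/lock/spdk_lock
  # expect PASS for test contend, hold_by_poller and hold_by_message, with '0 assertions failed'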
00:07:07.227 [2024-11-30 15:44:15.191140] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2
00:07:07.484 [2024-11-30 15:44:15.216332] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1
00:07:07.484 [2024-11-30 15:44:15.216334] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:07:07.742 [2024-11-30 15:44:15.707148] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 980:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:07:07.742 [2024-11-30 15:44:15.707185] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3112:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread)
00:07:07.742 [2024-11-30 15:44:15.707195] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3067:sspin_stacks_print: *ERROR*: spinlock 0x1361200
00:07:08.001 [2024-11-30 15:44:15.707861] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:07:08.001 [2024-11-30 15:44:15.707964] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1041:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:07:08.001 [2024-11-30 15:44:15.707983] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0)
00:07:08.001 Starting test contend
00:07:08.001 Worker    Delay  Wait us  Hold us  Total us
00:07:08.001      0        3   171239   185739    356979
00:07:08.001      1        5    89037   286509    375547
00:07:08.001 PASS test contend
00:07:08.001 Starting test hold_by_poller
00:07:08.001 PASS test hold_by_poller
00:07:08.001 Starting test hold_by_message
00:07:08.001 PASS test hold_by_message
00:07:08.001 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary:
00:07:08.001 100014 assertions passed
00:07:08.001 0 assertions failed
00:07:08.001
00:07:08.001 real 0m0.741s
00:07:08.001 user 0m1.040s
00:07:08.001 sys 0m0.090s
00:07:08.001 15:44:15 thread.thread_spdk_lock -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:08.001 15:44:15 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x
00:07:08.001 ************************************
00:07:08.001 END TEST thread_spdk_lock
00:07:08.001 ************************************
00:07:08.001
00:07:08.001 real 0m3.672s
00:07:08.001 user 0m3.353s
00:07:08.001 sys 0m0.535s
00:07:08.001 15:44:15 thread -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:08.001 15:44:15 thread -- common/autotest_common.sh@10 -- # set +x
00:07:08.001 ************************************
00:07:08.001 END TEST thread
00:07:08.001 ************************************
00:07:08.001 15:44:15 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]]
00:07:08.001 15:44:15 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh
00:07:08.001 15:44:15 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:07:08.001 15:44:15 -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:08.001 15:44:15 -- common/autotest_common.sh@10 -- # set +x
00:07:08.001 ************************************
00:07:08.001 START TEST
app_cmdline 00:07:08.001 ************************************ 00:07:08.001 15:44:15 app_cmdline -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:08.001 * Looking for test storage... 00:07:08.259 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:08.259 15:44:15 app_cmdline -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:08.260 15:44:15 app_cmdline -- common/autotest_common.sh@1693 -- # lcov --version 00:07:08.260 15:44:15 app_cmdline -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@345 -- # : 1 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:08.260 15:44:16 app_cmdline -- scripts/common.sh@368 -- # return 0 00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:08.260 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:08.260 --rc genhtml_branch_coverage=1 00:07:08.260 --rc genhtml_function_coverage=1 00:07:08.260 --rc genhtml_legend=1 00:07:08.260 --rc geninfo_all_blocks=1 00:07:08.260 --rc geninfo_unexecuted_blocks=1 00:07:08.260 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:08.260 ' 00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:08.260 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:08.260 --rc genhtml_branch_coverage=1 00:07:08.260 --rc genhtml_function_coverage=1 00:07:08.260 --rc genhtml_legend=1 00:07:08.260 --rc geninfo_all_blocks=1 00:07:08.260 --rc geninfo_unexecuted_blocks=1 00:07:08.260 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:08.260 ' 00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:08.260 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:08.260 --rc genhtml_branch_coverage=1 00:07:08.260 --rc genhtml_function_coverage=1 00:07:08.260 --rc genhtml_legend=1 00:07:08.260 --rc geninfo_all_blocks=1 00:07:08.260 --rc geninfo_unexecuted_blocks=1 00:07:08.260 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:08.260 ' 00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:08.260 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:08.260 --rc genhtml_branch_coverage=1 00:07:08.260 --rc genhtml_function_coverage=1 00:07:08.260 --rc genhtml_legend=1 00:07:08.260 --rc geninfo_all_blocks=1 00:07:08.260 --rc geninfo_unexecuted_blocks=1 00:07:08.260 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:08.260 ' 00:07:08.260 15:44:16 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:08.260 15:44:16 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1709630 00:07:08.260 15:44:16 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:08.260 15:44:16 app_cmdline -- app/cmdline.sh@18 -- # 
waitforlisten 1709630
00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@835 -- # '[' -z 1709630 ']'
00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@840 -- # local max_retries=100
00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:07:08.260 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@844 -- # xtrace_disable
00:07:08.260 15:44:16 app_cmdline -- common/autotest_common.sh@10 -- # set +x
00:07:08.260 [2024-11-30 15:44:16.084739] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization...
00:07:08.260 [2024-11-30 15:44:16.084830] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1709630 ]
00:07:08.260 [2024-11-30 15:44:16.220636] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation.
00:07:08.519 [2024-11-30 15:44:16.256379] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:08.519 [2024-11-30 15:44:16.278496] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:07:09.084 15:44:16 app_cmdline -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:07:09.084 15:44:16 app_cmdline -- common/autotest_common.sh@868 -- # return 0
00:07:09.084 15:44:16 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version
00:07:09.343 {
00:07:09.343   "version": "SPDK v25.01-pre git sha1 35cd3e84d",
00:07:09.343   "fields": {
00:07:09.343     "major": 25,
00:07:09.343     "minor": 1,
00:07:09.343     "patch": 0,
00:07:09.343     "suffix": "-pre",
00:07:09.343     "commit": "35cd3e84d"
00:07:09.343   }
00:07:09.343 }
00:07:09.343 15:44:17 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=()
00:07:09.343 15:44:17 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods")
00:07:09.343 15:44:17 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version")
00:07:09.343 15:44:17 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort))
00:07:09.343 15:44:17 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods
00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@563 -- # xtrace_disable
00:07:09.343 15:44:17 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]'
00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@10 -- # set +x
00:07:09.343 15:44:17 app_cmdline -- app/cmdline.sh@26 -- # sort
00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:07:09.343 15:44:17 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 ))
00:07:09.343 15:44:17 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]]
00:07:09.343 15:44:17 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@652 -- # local es=0
00:07:09.343 15:44:17 app_cmdline --
common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:09.343 15:44:17 app_cmdline -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:09.603 request: 00:07:09.603 { 00:07:09.603 "method": "env_dpdk_get_mem_stats", 00:07:09.603 "req_id": 1 00:07:09.603 } 00:07:09.603 Got JSON-RPC error response 00:07:09.603 response: 00:07:09.603 { 00:07:09.603 "code": -32601, 00:07:09.603 "message": "Method not found" 00:07:09.603 } 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@655 -- # es=1 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:09.603 15:44:17 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1709630 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@954 -- # '[' -z 1709630 ']' 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@958 -- # kill -0 1709630 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@959 -- # uname 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 1709630 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@972 -- # echo 'killing process with pid 1709630' 00:07:09.603 killing process with pid 1709630 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@973 -- # kill 1709630 00:07:09.603 15:44:17 app_cmdline -- common/autotest_common.sh@978 -- # wait 1709630 00:07:09.862 00:07:09.862 real 0m1.822s 00:07:09.862 user 0m2.043s 00:07:09.862 sys 0m0.508s 00:07:09.862 15:44:17 app_cmdline -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:09.862 15:44:17 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:09.862 ************************************ 00:07:09.862 END TEST app_cmdline 00:07:09.862 ************************************ 00:07:09.862 15:44:17 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 
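The version test that starts here recovers the SPDK version by scraping include/spdk/version.h, exactly as the xtrace below shows: grep the #define, take the second tab-separated field, strip the quotes. Condensed into one helper (an illustrative condensation of the commands in test/app/version.sh, with the workspace path shortened; cut -f2 assumes the header separates macro name and value with a tab):

    get_header_version() {
      grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" include/spdk/version.h \
        | cut -f2 | tr -d '"'
    }
    major=$(get_header_version MAJOR)     # 25
    minor=$(get_header_version MINOR)     # 1
    patch=$(get_header_version PATCH)     # 0
    suffix=$(get_header_version SUFFIX)   # -pre

With patch equal to 0 the script assembles 25.1 and renders the -pre suffix as rc0, giving the 25.1rc0 that the final python3 step compares against spdk.__version__.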
00:07:09.862 15:44:17 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:09.862 15:44:17 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:09.862 15:44:17 -- common/autotest_common.sh@10 -- # set +x 00:07:09.862 ************************************ 00:07:09.862 START TEST version 00:07:09.862 ************************************ 00:07:09.862 15:44:17 version -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:10.121 * Looking for test storage... 00:07:10.121 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:10.121 15:44:17 version -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:10.121 15:44:17 version -- common/autotest_common.sh@1693 -- # lcov --version 00:07:10.121 15:44:17 version -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:10.121 15:44:17 version -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:10.121 15:44:17 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:10.121 15:44:17 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:10.122 15:44:17 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:10.122 15:44:17 version -- scripts/common.sh@336 -- # IFS=.-: 00:07:10.122 15:44:17 version -- scripts/common.sh@336 -- # read -ra ver1 00:07:10.122 15:44:17 version -- scripts/common.sh@337 -- # IFS=.-: 00:07:10.122 15:44:17 version -- scripts/common.sh@337 -- # read -ra ver2 00:07:10.122 15:44:17 version -- scripts/common.sh@338 -- # local 'op=<' 00:07:10.122 15:44:17 version -- scripts/common.sh@340 -- # ver1_l=2 00:07:10.122 15:44:17 version -- scripts/common.sh@341 -- # ver2_l=1 00:07:10.122 15:44:17 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:10.122 15:44:17 version -- scripts/common.sh@344 -- # case "$op" in 00:07:10.122 15:44:17 version -- scripts/common.sh@345 -- # : 1 00:07:10.122 15:44:17 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:10.122 15:44:17 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:10.122 15:44:17 version -- scripts/common.sh@365 -- # decimal 1 00:07:10.122 15:44:17 version -- scripts/common.sh@353 -- # local d=1 00:07:10.122 15:44:17 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:10.122 15:44:17 version -- scripts/common.sh@355 -- # echo 1 00:07:10.122 15:44:17 version -- scripts/common.sh@365 -- # ver1[v]=1 00:07:10.122 15:44:17 version -- scripts/common.sh@366 -- # decimal 2 00:07:10.122 15:44:17 version -- scripts/common.sh@353 -- # local d=2 00:07:10.122 15:44:17 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:10.122 15:44:17 version -- scripts/common.sh@355 -- # echo 2 00:07:10.122 15:44:17 version -- scripts/common.sh@366 -- # ver2[v]=2 00:07:10.122 15:44:17 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:10.122 15:44:17 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:10.122 15:44:17 version -- scripts/common.sh@368 -- # return 0 00:07:10.122 15:44:17 version -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:10.122 15:44:17 version -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:10.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.122 --rc genhtml_branch_coverage=1 00:07:10.122 --rc genhtml_function_coverage=1 00:07:10.122 --rc genhtml_legend=1 00:07:10.122 --rc geninfo_all_blocks=1 00:07:10.122 --rc geninfo_unexecuted_blocks=1 00:07:10.122 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.122 ' 00:07:10.122 15:44:17 version -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:10.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.122 --rc genhtml_branch_coverage=1 00:07:10.122 --rc genhtml_function_coverage=1 00:07:10.122 --rc genhtml_legend=1 00:07:10.122 --rc geninfo_all_blocks=1 00:07:10.122 --rc geninfo_unexecuted_blocks=1 00:07:10.122 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.122 ' 00:07:10.122 15:44:17 version -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:10.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.122 --rc genhtml_branch_coverage=1 00:07:10.122 --rc genhtml_function_coverage=1 00:07:10.122 --rc genhtml_legend=1 00:07:10.122 --rc geninfo_all_blocks=1 00:07:10.122 --rc geninfo_unexecuted_blocks=1 00:07:10.122 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.122 ' 00:07:10.122 15:44:17 version -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:10.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.122 --rc genhtml_branch_coverage=1 00:07:10.122 --rc genhtml_function_coverage=1 00:07:10.122 --rc genhtml_legend=1 00:07:10.122 --rc geninfo_all_blocks=1 00:07:10.122 --rc geninfo_unexecuted_blocks=1 00:07:10.122 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.122 ' 00:07:10.122 15:44:17 version -- app/version.sh@17 -- # get_header_version major 00:07:10.122 15:44:17 version -- app/version.sh@14 -- # tr -d '"' 00:07:10.122 15:44:17 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:10.122 15:44:17 version -- app/version.sh@14 -- # cut -f2 00:07:10.122 15:44:17 version -- app/version.sh@17 -- # major=25 00:07:10.122 15:44:17 version -- 
app/version.sh@18 -- # get_header_version minor 00:07:10.122 15:44:17 version -- app/version.sh@14 -- # tr -d '"' 00:07:10.122 15:44:17 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:10.122 15:44:17 version -- app/version.sh@14 -- # cut -f2 00:07:10.122 15:44:17 version -- app/version.sh@18 -- # minor=1 00:07:10.122 15:44:17 version -- app/version.sh@19 -- # get_header_version patch 00:07:10.122 15:44:17 version -- app/version.sh@14 -- # cut -f2 00:07:10.122 15:44:17 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:10.122 15:44:17 version -- app/version.sh@14 -- # tr -d '"' 00:07:10.122 15:44:17 version -- app/version.sh@19 -- # patch=0 00:07:10.122 15:44:17 version -- app/version.sh@20 -- # get_header_version suffix 00:07:10.122 15:44:17 version -- app/version.sh@14 -- # cut -f2 00:07:10.122 15:44:17 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:10.122 15:44:17 version -- app/version.sh@14 -- # tr -d '"' 00:07:10.122 15:44:17 version -- app/version.sh@20 -- # suffix=-pre 00:07:10.122 15:44:17 version -- app/version.sh@22 -- # version=25.1 00:07:10.122 15:44:17 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:10.122 15:44:17 version -- app/version.sh@28 -- # version=25.1rc0 00:07:10.122 15:44:17 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:10.122 15:44:17 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:10.122 15:44:18 version -- app/version.sh@30 -- # py_version=25.1rc0 00:07:10.122 15:44:18 version -- app/version.sh@31 -- # [[ 25.1rc0 == \2\5\.\1\r\c\0 ]] 00:07:10.122 00:07:10.122 real 0m0.267s 00:07:10.122 user 0m0.139s 00:07:10.122 sys 0m0.173s 00:07:10.122 15:44:18 version -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:10.122 15:44:18 version -- common/autotest_common.sh@10 -- # set +x 00:07:10.122 ************************************ 00:07:10.122 END TEST version 00:07:10.122 ************************************ 00:07:10.122 15:44:18 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:07:10.122 15:44:18 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:07:10.122 15:44:18 -- spdk/autotest.sh@194 -- # uname -s 00:07:10.122 15:44:18 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:10.122 15:44:18 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:10.122 15:44:18 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:10.122 15:44:18 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:10.122 15:44:18 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:07:10.122 15:44:18 -- spdk/autotest.sh@260 -- # timing_exit lib 00:07:10.122 15:44:18 -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:10.122 15:44:18 -- common/autotest_common.sh@10 -- # set +x 00:07:10.382 15:44:18 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@267 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@276 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- 
spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:10.382 15:44:18 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:10.382 15:44:18 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:10.382 15:44:18 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:10.382 15:44:18 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:10.382 15:44:18 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:10.382 15:44:18 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:10.382 15:44:18 -- common/autotest_common.sh@10 -- # set +x 00:07:10.382 ************************************ 00:07:10.382 START TEST llvm_fuzz 00:07:10.382 ************************************ 00:07:10.382 15:44:18 llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:10.382 * Looking for test storage... 00:07:10.382 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:10.382 15:44:18 llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:10.382 15:44:18 llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:10.382 15:44:18 llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:10.382 15:44:18 llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:10.382 15:44:18 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:10.643 15:44:18 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:10.643 15:44:18 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:10.643 15:44:18 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:10.643 15:44:18 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:10.643 15:44:18 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:10.643 15:44:18 llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:10.643 15:44:18 llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:10.643 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.643 --rc genhtml_branch_coverage=1 00:07:10.643 --rc genhtml_function_coverage=1 00:07:10.644 --rc genhtml_legend=1 00:07:10.644 --rc geninfo_all_blocks=1 00:07:10.644 --rc geninfo_unexecuted_blocks=1 00:07:10.644 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.644 ' 00:07:10.644 15:44:18 llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:10.644 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.644 --rc genhtml_branch_coverage=1 00:07:10.644 --rc genhtml_function_coverage=1 00:07:10.644 --rc genhtml_legend=1 00:07:10.644 --rc geninfo_all_blocks=1 00:07:10.644 --rc geninfo_unexecuted_blocks=1 00:07:10.644 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.644 ' 00:07:10.644 15:44:18 llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:10.644 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.644 --rc genhtml_branch_coverage=1 00:07:10.644 --rc genhtml_function_coverage=1 00:07:10.644 --rc genhtml_legend=1 00:07:10.644 --rc geninfo_all_blocks=1 00:07:10.644 --rc geninfo_unexecuted_blocks=1 00:07:10.644 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.644 ' 00:07:10.644 15:44:18 llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:10.644 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.644 --rc genhtml_branch_coverage=1 00:07:10.644 --rc genhtml_function_coverage=1 00:07:10.644 --rc genhtml_legend=1 00:07:10.644 --rc geninfo_all_blocks=1 00:07:10.644 --rc geninfo_unexecuted_blocks=1 00:07:10.644 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.644 ' 00:07:10.644 15:44:18 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:10.644 15:44:18 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:10.644 15:44:18 llvm_fuzz -- common/autotest_common.sh@550 -- # fuzzers=() 00:07:10.644 15:44:18 llvm_fuzz -- common/autotest_common.sh@550 -- # local fuzzers 00:07:10.644 15:44:18 llvm_fuzz -- common/autotest_common.sh@552 -- # [[ -n '' ]] 00:07:10.644 15:44:18 llvm_fuzz -- 
common/autotest_common.sh@555 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:10.644 15:44:18 llvm_fuzz -- common/autotest_common.sh@556 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:10.644 15:44:18 llvm_fuzz -- common/autotest_common.sh@559 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:10.644 15:44:18 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:10.644 15:44:18 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:10.644 15:44:18 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:10.644 15:44:18 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:10.644 15:44:18 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:10.644 15:44:18 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:10.644 15:44:18 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:10.644 15:44:18 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:10.644 15:44:18 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:10.644 15:44:18 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:10.644 15:44:18 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:10.644 15:44:18 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:07:10.644 ************************************ 00:07:10.644 START TEST nvmf_llvm_fuzz 00:07:10.644 ************************************ 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:10.644 * Looking for test storage... 
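How execution arrived at this nvmf run.sh is visible in the llvm.sh trace above: every entry under test/fuzz/llvm/ is collected as a candidate target, helper scripts are filtered out by the case statement, and each surviving name is handed to run_test. An abridged sketch of that discovery loop (reconstructed from the trace, not a verbatim copy of llvm.sh; $rootdir is the SPDK checkout as in the log):

    fuzzers=("$rootdir"/test/fuzz/llvm/*)   # common.sh llvm-gcov.sh nvmf vfio
    fuzzers=("${fuzzers[@]##*/}")           # keep basenames only
    for fuzzer in "${fuzzers[@]}"; do
      case "$fuzzer" in
        nvmf | vfio)
          run_test "${fuzzer}_llvm_fuzz" "$rootdir/test/fuzz/llvm/$fuzzer/run.sh" ;;
        *) ;;                               # common.sh etc. are not fuzz targets
      esac
    done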
00:07:10.644 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:10.644 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.644 --rc genhtml_branch_coverage=1 00:07:10.644 --rc genhtml_function_coverage=1 00:07:10.644 --rc genhtml_legend=1 00:07:10.644 --rc geninfo_all_blocks=1 00:07:10.644 --rc geninfo_unexecuted_blocks=1 00:07:10.644 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.644 ' 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:10.644 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.644 --rc genhtml_branch_coverage=1 00:07:10.644 --rc genhtml_function_coverage=1 00:07:10.644 --rc genhtml_legend=1 00:07:10.644 --rc geninfo_all_blocks=1 00:07:10.644 --rc geninfo_unexecuted_blocks=1 00:07:10.644 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.644 ' 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:10.644 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.644 --rc genhtml_branch_coverage=1 00:07:10.644 --rc genhtml_function_coverage=1 00:07:10.644 --rc genhtml_legend=1 00:07:10.644 --rc geninfo_all_blocks=1 00:07:10.644 --rc geninfo_unexecuted_blocks=1 00:07:10.644 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.644 ' 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:10.644 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.644 --rc genhtml_branch_coverage=1 00:07:10.644 --rc genhtml_function_coverage=1 00:07:10.644 --rc genhtml_legend=1 00:07:10.644 --rc geninfo_all_blocks=1 00:07:10.644 --rc geninfo_unexecuted_blocks=1 00:07:10.644 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.644 ' 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:10.644 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:07:10.645 15:44:18 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:10.645 15:44:18 
llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:10.645 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:10.646 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:10.646 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:10.646 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:10.909 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:10.909 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:10.909 #define SPDK_CONFIG_H 00:07:10.909 #define SPDK_CONFIG_AIO_FSDEV 1 00:07:10.909 #define SPDK_CONFIG_APPS 1 00:07:10.909 #define SPDK_CONFIG_ARCH native 00:07:10.909 #undef SPDK_CONFIG_ASAN 00:07:10.909 #undef SPDK_CONFIG_AVAHI 00:07:10.909 #undef SPDK_CONFIG_CET 00:07:10.909 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:07:10.909 #define SPDK_CONFIG_COVERAGE 1 00:07:10.909 #define SPDK_CONFIG_CROSS_PREFIX 00:07:10.909 #undef SPDK_CONFIG_CRYPTO 00:07:10.909 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:10.909 #undef SPDK_CONFIG_CUSTOMOCF 00:07:10.909 #undef SPDK_CONFIG_DAOS 00:07:10.909 #define SPDK_CONFIG_DAOS_DIR 00:07:10.909 #define SPDK_CONFIG_DEBUG 1 00:07:10.909 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:10.909 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:10.909 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:10.909 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:10.909 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:10.909 #undef SPDK_CONFIG_DPDK_UADK 00:07:10.909 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:10.909 #define SPDK_CONFIG_EXAMPLES 1 00:07:10.909 #undef SPDK_CONFIG_FC 00:07:10.909 #define SPDK_CONFIG_FC_PATH 00:07:10.909 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:10.910 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:10.910 #define SPDK_CONFIG_FSDEV 1 00:07:10.910 #undef 
SPDK_CONFIG_FUSE 00:07:10.910 #define SPDK_CONFIG_FUZZER 1 00:07:10.910 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:10.910 #undef SPDK_CONFIG_GOLANG 00:07:10.910 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:10.910 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:10.910 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:10.910 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:07:10.910 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:10.910 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:10.910 #undef SPDK_CONFIG_HAVE_LZ4 00:07:10.910 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:07:10.910 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:07:10.910 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:10.910 #define SPDK_CONFIG_IDXD 1 00:07:10.910 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:10.910 #undef SPDK_CONFIG_IPSEC_MB 00:07:10.910 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:10.910 #define SPDK_CONFIG_ISAL 1 00:07:10.910 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:10.910 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:10.910 #define SPDK_CONFIG_LIBDIR 00:07:10.910 #undef SPDK_CONFIG_LTO 00:07:10.910 #define SPDK_CONFIG_MAX_LCORES 128 00:07:10.910 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:07:10.910 #define SPDK_CONFIG_NVME_CUSE 1 00:07:10.910 #undef SPDK_CONFIG_OCF 00:07:10.910 #define SPDK_CONFIG_OCF_PATH 00:07:10.910 #define SPDK_CONFIG_OPENSSL_PATH 00:07:10.910 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:10.910 #define SPDK_CONFIG_PGO_DIR 00:07:10.910 #undef SPDK_CONFIG_PGO_USE 00:07:10.910 #define SPDK_CONFIG_PREFIX /usr/local 00:07:10.910 #undef SPDK_CONFIG_RAID5F 00:07:10.910 #undef SPDK_CONFIG_RBD 00:07:10.910 #define SPDK_CONFIG_RDMA 1 00:07:10.910 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:10.910 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:10.910 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:10.910 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:10.910 #undef SPDK_CONFIG_SHARED 00:07:10.910 #undef SPDK_CONFIG_SMA 00:07:10.910 #define SPDK_CONFIG_TESTS 1 00:07:10.910 #undef SPDK_CONFIG_TSAN 00:07:10.910 #define SPDK_CONFIG_UBLK 1 00:07:10.910 #define SPDK_CONFIG_UBSAN 1 00:07:10.910 #undef SPDK_CONFIG_UNIT_TESTS 00:07:10.910 #undef SPDK_CONFIG_URING 00:07:10.910 #define SPDK_CONFIG_URING_PATH 00:07:10.910 #undef SPDK_CONFIG_URING_ZNS 00:07:10.910 #undef SPDK_CONFIG_USDT 00:07:10.910 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:10.910 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:10.910 #define SPDK_CONFIG_VFIO_USER 1 00:07:10.910 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:10.910 #define SPDK_CONFIG_VHOST 1 00:07:10.910 #define SPDK_CONFIG_VIRTIO 1 00:07:10.910 #undef SPDK_CONFIG_VTUNE 00:07:10.910 #define SPDK_CONFIG_VTUNE_DIR 00:07:10.910 #define SPDK_CONFIG_WERROR 1 00:07:10.910 #define SPDK_CONFIG_WPDK_DIR 00:07:10.910 #undef SPDK_CONFIG_XNVME 00:07:10.910 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:07:10.910 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:07:10.911 15:44:18 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:07:10.911 
15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : main 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 
00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:10.911 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 1710167 ]] 00:07:10.912 
15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 1710167 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:07:10.912 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.JmyAux 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.JmyAux/tests/nvmf /tmp/spdk.JmyAux 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=51140128768 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730607104 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=10590478336 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30860537856 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865301504 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=4763648 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340129792 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346122240 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5992448 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30863253504 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865305600 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=2052096 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173044736 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173057024 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:07:10.913 * Looking for test storage... 
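The set_test_storage trace above boils down to: enumerate mounts with df, find the filesystem backing the candidate test directory, and accept it only if enough free space is available for the requested size. A minimal standalone sketch of that idea in bash (the helper name pick_test_storage and the single-directory df call are simplifications; the real function also walks several storage_candidates and pads the request with a reserve):

  #!/usr/bin/env bash
  # Accept a test directory only if its backing filesystem has enough free bytes.
  pick_test_storage() {
    local requested_size=$1 target_dir=$2
    local src fstype size used avail mnt
    # df on a single directory prints one data row: the mount backing target_dir.
    while read -r src fstype size used avail mnt; do
      if (( avail >= requested_size )); then
        printf '* Found test storage at %s\n' "$target_dir"
        return 0
      fi
    done < <(df -B1 --output=source,fstype,size,used,avail,target "$target_dir" | tail -n +2)
    return 1
  }

  # Example: ask for 2 GiB, mirroring the requested_size=2147483648 in the trace.
  pick_test_storage $((2 * 1024 ** 3)) /tmp/spdk.test/tests/nvmf || echo 'no suitable storage'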
00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=51140128768 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=12805070848 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:10.913 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:10.913 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:10.914 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.914 --rc genhtml_branch_coverage=1 00:07:10.914 --rc genhtml_function_coverage=1 00:07:10.914 --rc genhtml_legend=1 00:07:10.914 --rc geninfo_all_blocks=1 00:07:10.914 --rc geninfo_unexecuted_blocks=1 00:07:10.914 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.914 ' 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:10.914 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.914 --rc genhtml_branch_coverage=1 00:07:10.914 --rc genhtml_function_coverage=1 00:07:10.914 --rc genhtml_legend=1 00:07:10.914 --rc geninfo_all_blocks=1 00:07:10.914 --rc geninfo_unexecuted_blocks=1 00:07:10.914 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.914 ' 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:10.914 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.914 --rc genhtml_branch_coverage=1 00:07:10.914 --rc genhtml_function_coverage=1 00:07:10.914 --rc genhtml_legend=1 00:07:10.914 --rc geninfo_all_blocks=1 00:07:10.914 --rc geninfo_unexecuted_blocks=1 00:07:10.914 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.914 ' 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:10.914 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.914 --rc genhtml_branch_coverage=1 00:07:10.914 --rc genhtml_function_coverage=1 00:07:10.914 --rc genhtml_legend=1 00:07:10.914 --rc geninfo_all_blocks=1 00:07:10.914 --rc geninfo_unexecuted_blocks=1 00:07:10.914 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.914 ' 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:10.914 15:44:18 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:10.914 15:44:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:10.914 [2024-11-30 15:44:18.857710] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:10.914 [2024-11-30 15:44:18.857778] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1710239 ] 00:07:11.483 [2024-11-30 15:44:19.183077] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:11.483 [2024-11-30 15:44:19.230355] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.483 [2024-11-30 15:44:19.245888] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.483 [2024-11-30 15:44:19.298318] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:11.483 [2024-11-30 15:44:19.314638] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:11.483 INFO: Running with entropic power schedule (0xFF, 100). 00:07:11.483 INFO: Seed: 1829952437 00:07:11.483 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:11.484 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:11.484 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:11.484 INFO: A corpus is not provided, starting from an empty corpus 00:07:11.484 #2 INITED exec/s: 0 rss: 64Mb 00:07:11.484 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:11.484 This may also happen if the target rejected all inputs we tried so far 00:07:11.484 [2024-11-30 15:44:19.369877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:11.484 [2024-11-30 15:44:19.369906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.743 NEW_FUNC[1/717]: 0x45ed08 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:11.743 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:11.743 #6 NEW cov: 12252 ft: 12251 corp: 2/78b lim: 320 exec/s: 0 rss: 72Mb L: 77/77 MS: 4 ChangeByte-InsertByte-EraseBytes-InsertRepeatedBytes- 00:07:11.743 [2024-11-30 15:44:19.699975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES LBA RANGE TYPE cid:4 cdw10:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:11.743 [2024-11-30 15:44:19.700008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.003 #8 NEW cov: 12391 ft: 12936 corp: 3/145b lim: 320 exec/s: 0 rss: 72Mb L: 67/77 MS: 2 InsertByte-CrossOver- 00:07:12.003 [2024-11-30 15:44:19.739899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.003 [2024-11-30 15:44:19.739925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.003 #11 NEW cov: 12407 ft: 13285 corp: 4/223b lim: 320 exec/s: 0 rss: 72Mb L: 78/78 MS: 3 ChangeBit-CopyPart-CrossOver- 00:07:12.003 [2024-11-30 15:44:19.779870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.003 [2024-11-30 15:44:19.779896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.003 #17 NEW cov: 12492 ft: 13552 corp: 5/300b lim: 320 exec/s: 0 rss: 72Mb L: 77/78 MS: 1 CrossOver- 00:07:12.003 [2024-11-30 15:44:19.839953] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES LBA RANGE TYPE cid:4 cdw10:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.003 [2024-11-30 15:44:19.839979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.003 #18 NEW cov: 12492 ft: 13713 corp: 6/368b lim: 320 exec/s: 0 rss: 72Mb L: 68/78 MS: 1 InsertByte- 00:07:12.003 [2024-11-30 15:44:19.899934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.003 [2024-11-30 15:44:19.899960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.003 #19 NEW cov: 12492 ft: 13779 corp: 7/445b lim: 320 exec/s: 0 rss: 72Mb L: 77/78 MS: 1 ChangeByte- 00:07:12.003 [2024-11-30 15:44:19.940304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.003 [2024-11-30 15:44:19.940331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.003 [2024-11-30 15:44:19.940401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.003 [2024-11-30 15:44:19.940419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.003 #25 NEW cov: 12495 ft: 14004 corp: 8/601b lim: 320 exec/s: 0 rss: 72Mb L: 156/156 MS: 1 InsertRepeatedBytes- 00:07:12.264 [2024-11-30 15:44:19.980095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.264 [2024-11-30 15:44:19.980120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.264 [2024-11-30 15:44:19.980176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.264 [2024-11-30 15:44:19.980190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.264 #26 NEW cov: 12495 ft: 14089 corp: 9/754b lim: 320 exec/s: 0 rss: 72Mb L: 153/156 MS: 1 CrossOver- 00:07:12.264 [2024-11-30 15:44:20.040173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x303030303030303 00:07:12.264 [2024-11-30 15:44:20.040203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.264 [2024-11-30 15:44:20.040257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:03030303 cdw11:03030303 00:07:12.264 [2024-11-30 15:44:20.040272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.264 #27 NEW cov: 12495 ft: 14167 corp: 10/938b lim: 320 exec/s: 0 rss: 72Mb L: 184/184 MS: 1 CopyPart- 00:07:12.264 [2024-11-30 15:44:20.080040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.264 [2024-11-30 15:44:20.080071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.264 #28 NEW cov: 12495 ft: 14256 corp: 11/1015b lim: 320 exec/s: 0 rss: 72Mb L: 77/184 MS: 1 ChangeByte- 00:07:12.264 [2024-11-30 15:44:20.140066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x30303030303 00:07:12.264 [2024-11-30 15:44:20.140095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.264 #29 NEW cov: 12495 ft: 14313 corp: 12/1092b lim: 320 exec/s: 0 rss: 72Mb L: 77/184 MS: 1 CMP- DE: "\000\000\000\002"- 00:07:12.264 [2024-11-30 15:44:20.180187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.264 [2024-11-30 15:44:20.180213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.264 [2024-11-30 15:44:20.180269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.264 [2024-11-30 15:44:20.180283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.264 #30 NEW cov: 12495 ft: 14321 corp: 13/1248b lim: 320 exec/s: 0 rss: 72Mb L: 156/184 MS: 1 ChangeBit- 00:07:12.264 [2024-11-30 15:44:20.220094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f3) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x30303030303 00:07:12.264 [2024-11-30 15:44:20.220120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.524 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:12.524 #31 NEW cov: 12518 ft: 14416 corp: 14/1325b lim: 320 exec/s: 0 rss: 72Mb L: 77/184 MS: 1 ChangeBinInt- 00:07:12.524 [2024-11-30 15:44:20.280142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.524 [2024-11-30 15:44:20.280169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.524 #32 NEW cov: 12518 
ft: 14459 corp: 15/1402b lim: 320 exec/s: 0 rss: 73Mb L: 77/184 MS: 1 CopyPart- 00:07:12.524 [2024-11-30 15:44:20.340178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:39390303 cdw10:39393939 cdw11:39393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x3939393939393939 00:07:12.524 [2024-11-30 15:44:20.340205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.524 #35 NEW cov: 12518 ft: 14471 corp: 16/1521b lim: 320 exec/s: 35 rss: 73Mb L: 119/184 MS: 3 EraseBytes-CopyPart-InsertRepeatedBytes- 00:07:12.524 [2024-11-30 15:44:20.380200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f3) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5600030303030303 00:07:12.524 [2024-11-30 15:44:20.380227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.524 #36 NEW cov: 12518 ft: 14489 corp: 17/1599b lim: 320 exec/s: 36 rss: 73Mb L: 78/184 MS: 1 InsertByte- 00:07:12.524 [2024-11-30 15:44:20.440368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.524 [2024-11-30 15:44:20.440394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.524 [2024-11-30 15:44:20.440452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.524 [2024-11-30 15:44:20.440466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.524 #37 NEW cov: 12518 ft: 14512 corp: 18/1752b lim: 320 exec/s: 37 rss: 73Mb L: 153/184 MS: 1 CrossOver- 00:07:12.784 [2024-11-30 15:44:20.500405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.784 [2024-11-30 15:44:20.500431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.784 [2024-11-30 15:44:20.500484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.784 [2024-11-30 15:44:20.500499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.784 #38 NEW cov: 12518 ft: 14550 corp: 19/1905b lim: 320 exec/s: 38 rss: 73Mb L: 153/184 MS: 1 ChangeBinInt- 00:07:12.784 [2024-11-30 15:44:20.540382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:01030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.784 [2024-11-30 15:44:20.540409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.784 [2024-11-30 15:44:20.540463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.784 [2024-11-30 15:44:20.540477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.784 
#39 NEW cov: 12518 ft: 14559 corp: 20/2058b lim: 320 exec/s: 39 rss: 73Mb L: 153/184 MS: 1 ChangeBit- 00:07:12.784 [2024-11-30 15:44:20.580402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:303f803 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.784 [2024-11-30 15:44:20.580433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.784 [2024-11-30 15:44:20.580488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.784 [2024-11-30 15:44:20.580503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.784 #40 NEW cov: 12518 ft: 14568 corp: 21/2211b lim: 320 exec/s: 40 rss: 73Mb L: 153/184 MS: 1 ChangeBinInt- 00:07:12.784 [2024-11-30 15:44:20.620321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f3) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5600030303030303 00:07:12.785 [2024-11-30 15:44:20.620346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.785 #41 NEW cov: 12518 ft: 14612 corp: 22/2289b lim: 320 exec/s: 41 rss: 73Mb L: 78/184 MS: 1 ChangeBit- 00:07:12.785 [2024-11-30 15:44:20.680506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030307 cdw10:03030303 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:12.785 [2024-11-30 15:44:20.680532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.785 [2024-11-30 15:44:20.680587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.785 [2024-11-30 15:44:20.680606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.785 #42 NEW cov: 12518 ft: 14655 corp: 23/2449b lim: 320 exec/s: 42 rss: 73Mb L: 160/184 MS: 1 CMP- DE: "\377\377\377\007"- 00:07:12.785 [2024-11-30 15:44:20.720538] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5454545454545454 00:07:12.785 [2024-11-30 15:44:20.720564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.785 [2024-11-30 15:44:20.720627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (54) qid:0 cid:5 nsid:3035454 cdw10:03030303 cdw11:03030303 00:07:12.785 [2024-11-30 15:44:20.720642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.785 #43 NEW cov: 12518 ft: 14688 corp: 24/2582b lim: 320 exec/s: 43 rss: 73Mb L: 133/184 MS: 1 InsertRepeatedBytes- 00:07:13.045 [2024-11-30 15:44:20.760419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:13.045 [2024-11-30 15:44:20.760445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.045 #44 
NEW cov: 12518 ft: 14801 corp: 25/2661b lim: 320 exec/s: 44 rss: 73Mb L: 79/184 MS: 1 InsertByte- 00:07:13.045 [2024-11-30 15:44:20.820530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:13.045 [2024-11-30 15:44:20.820555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.045 [2024-11-30 15:44:20.820629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.045 [2024-11-30 15:44:20.820644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.045 #45 NEW cov: 12518 ft: 14813 corp: 26/2814b lim: 320 exec/s: 45 rss: 73Mb L: 153/184 MS: 1 ChangeBinInt- 00:07:13.045 [2024-11-30 15:44:20.860430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:13.045 [2024-11-30 15:44:20.860459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.045 #46 NEW cov: 12518 ft: 14841 corp: 27/2891b lim: 320 exec/s: 46 rss: 73Mb L: 77/184 MS: 1 ChangeBinInt- 00:07:13.045 [2024-11-30 15:44:20.900656] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5454545454545454 00:07:13.045 [2024-11-30 15:44:20.900682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.045 [2024-11-30 15:44:20.900755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (54) qid:0 cid:5 nsid:54545454 cdw10:03030303 cdw11:03030303 00:07:13.045 [2024-11-30 15:44:20.900770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.045 #47 NEW cov: 12518 ft: 14852 corp: 28/3032b lim: 320 exec/s: 47 rss: 73Mb L: 141/184 MS: 1 CopyPart- 00:07:13.045 [2024-11-30 15:44:20.960559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:13.045 [2024-11-30 15:44:20.960585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.045 [2024-11-30 15:44:20.960661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.045 [2024-11-30 15:44:20.960676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.045 #48 NEW cov: 12518 ft: 14879 corp: 29/3185b lim: 320 exec/s: 48 rss: 73Mb L: 153/184 MS: 1 CrossOver- 00:07:13.305 [2024-11-30 15:44:21.020460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:13.305 [2024-11-30 15:44:21.020486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.305 #49 NEW cov: 12518 ft: 14891 corp: 
30/3262b lim: 320 exec/s: 49 rss: 73Mb L: 77/184 MS: 1 ShuffleBytes- 00:07:13.305 [2024-11-30 15:44:21.080623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:13.305 [2024-11-30 15:44:21.080649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.305 [2024-11-30 15:44:21.080723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.305 [2024-11-30 15:44:21.080737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.305 #50 NEW cov: 12518 ft: 14902 corp: 31/3416b lim: 320 exec/s: 50 rss: 74Mb L: 154/184 MS: 1 InsertByte- 00:07:13.305 [2024-11-30 15:44:21.140500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:13.305 [2024-11-30 15:44:21.140525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.305 #51 NEW cov: 12518 ft: 14921 corp: 32/3493b lim: 320 exec/s: 51 rss: 74Mb L: 77/184 MS: 1 ShuffleBytes- 00:07:13.305 [2024-11-30 15:44:21.200898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:13.305 [2024-11-30 15:44:21.200923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.305 [2024-11-30 15:44:21.201004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (d5) qid:0 cid:5 nsid:d5d5d5d5 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.305 [2024-11-30 15:44:21.201021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.305 [2024-11-30 15:44:21.201084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (d5) qid:0 cid:6 nsid:d5d5d5d5 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.305 [2024-11-30 15:44:21.201098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.305 NEW_FUNC[1/1]: 0x1997408 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:13.305 #52 NEW cov: 12531 ft: 15547 corp: 33/3741b lim: 320 exec/s: 52 rss: 74Mb L: 248/248 MS: 1 InsertRepeatedBytes- 00:07:13.305 [2024-11-30 15:44:21.240541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:13.305 [2024-11-30 15:44:21.240566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.305 #53 NEW cov: 12531 ft: 15560 corp: 34/3818b lim: 320 exec/s: 53 rss: 74Mb L: 77/248 MS: 1 CopyPart- 00:07:13.566 [2024-11-30 15:44:21.280548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f3) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x5600030303030303 00:07:13.566 [2024-11-30 15:44:21.280573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.566 #54 NEW cov: 12531 ft: 15573 corp: 35/3896b lim: 320 exec/s: 54 rss: 74Mb L: 78/248 MS: 1 ChangeBinInt- 00:07:13.566 [2024-11-30 15:44:21.320573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x30303030303 00:07:13.566 [2024-11-30 15:44:21.320604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.566 #55 NEW cov: 12531 ft: 15591 corp: 36/3973b lim: 320 exec/s: 55 rss: 74Mb L: 77/248 MS: 1 ChangeByte- 00:07:13.566 [2024-11-30 15:44:21.360593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:5b030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x30303030303 00:07:13.566 [2024-11-30 15:44:21.360622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.566 #56 NEW cov: 12531 ft: 15597 corp: 37/4050b lim: 320 exec/s: 28 rss: 74Mb L: 77/248 MS: 1 ChangeByte- 00:07:13.566 #56 DONE cov: 12531 ft: 15597 corp: 37/4050b lim: 320 exec/s: 28 rss: 74Mb 00:07:13.566 ###### Recommended dictionary. ###### 00:07:13.566 "\000\000\000\002" # Uses: 0 00:07:13.566 "\377\377\377\007" # Uses: 0 00:07:13.566 ###### End of recommended dictionary. ###### 00:07:13.566 Done 56 runs in 2 second(s) 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:13.566 15:44:21 
llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:13.566 15:44:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:13.566 [2024-11-30 15:44:21.529674] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:13.566 [2024-11-30 15:44:21.529745] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1710762 ] 00:07:14.135 [2024-11-30 15:44:21.843041] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:14.135 [2024-11-30 15:44:21.889677] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.135 [2024-11-30 15:44:21.907669] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.135 [2024-11-30 15:44:21.960265] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:14.135 [2024-11-30 15:44:21.976580] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:14.135 INFO: Running with entropic power schedule (0xFF, 100). 00:07:14.135 INFO: Seed: 195985440 00:07:14.135 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:14.135 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:14.135 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:14.135 INFO: A corpus is not provided, starting from an empty corpus 00:07:14.135 #2 INITED exec/s: 0 rss: 64Mb 00:07:14.135 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:14.135 This may also happen if the target rejected all inputs we tried so far 00:07:14.135 [2024-11-30 15:44:22.052891] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (271360) > buf size (4096) 00:07:14.135 [2024-11-30 15:44:22.053334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff810a cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.135 [2024-11-30 15:44:22.053373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.703 NEW_FUNC[1/717]: 0x45f608 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:14.703 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:14.703 #16 NEW cov: 12335 ft: 12330 corp: 2/8b lim: 30 exec/s: 0 rss: 72Mb L: 7/7 MS: 4 CMP-EraseBytes-ChangeBinInt-CMP- DE: "\377\377\377\377"-"\021\000\000\000"- 00:07:14.703 [2024-11-30 15:44:22.402350] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.703 [2024-11-30 15:44:22.402721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.703 [2024-11-30 15:44:22.402763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.703 #17 NEW cov: 12454 ft: 13089 corp: 3/15b lim: 30 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:14.703 [2024-11-30 15:44:22.472638] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.703 [2024-11-30 15:44:22.472806] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.703 [2024-11-30 15:44:22.472970] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.703 [2024-11-30 15:44:22.473125] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.703 [2024-11-30 15:44:22.473276] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.703 [2024-11-30 15:44:22.473616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.703 [2024-11-30 15:44:22.473648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.703 [2024-11-30 15:44:22.473771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.703 [2024-11-30 15:44:22.473788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.703 [2024-11-30 15:44:22.473910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.703 [2024-11-30 15:44:22.473929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.703 [2024-11-30 15:44:22.474050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.703 [2024-11-30 15:44:22.474068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.703 [2024-11-30 15:44:22.474191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.703 [2024-11-30 15:44:22.474209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:14.703 #18 NEW cov: 12460 ft: 14067 corp: 4/45b lim: 30 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:14.703 [2024-11-30 15:44:22.542717] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.703 [2024-11-30 15:44:22.542905] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.703 [2024-11-30 15:44:22.543063] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000029ff 00:07:14.703 [2024-11-30 15:44:22.543211] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.703 [2024-11-30 15:44:22.543371] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.703 [2024-11-30 15:44:22.543742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.703 [2024-11-30 15:44:22.543774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.703 [2024-11-30 15:44:22.543896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.703 [2024-11-30 15:44:22.543915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.703 [2024-11-30 15:44:22.544027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.703 [2024-11-30 15:44:22.544046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.703 [2024-11-30 15:44:22.544169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.704 [2024-11-30 15:44:22.544186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.704 [2024-11-30 15:44:22.544309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.704 [2024-11-30 15:44:22.544328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:14.704 #19 NEW cov: 12545 ft: 14365 corp: 5/75b lim: 30 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 ChangeByte- 00:07:14.704 [2024-11-30 15:44:22.612443] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.704 [2024-11-30 15:44:22.612820] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ac cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.704 [2024-11-30 15:44:22.612851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.704 #20 NEW cov: 12545 ft: 14454 corp: 6/82b lim: 30 exec/s: 0 rss: 72Mb L: 7/30 MS: 1 ChangeByte- 00:07:14.704 [2024-11-30 15:44:22.662478] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.704 [2024-11-30 15:44:22.662826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.704 [2024-11-30 15:44:22.662855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.963 #26 NEW cov: 12545 ft: 14504 corp: 7/93b lim: 30 exec/s: 0 rss: 72Mb L: 11/30 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:14.963 [2024-11-30 15:44:22.732735] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.963 [2024-11-30 15:44:22.732914] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.963 [2024-11-30 15:44:22.733084] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.963 [2024-11-30 15:44:22.733427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.963 [2024-11-30 15:44:22.733457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.963 [2024-11-30 15:44:22.733589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.963 [2024-11-30 15:44:22.733611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.963 [2024-11-30 15:44:22.733723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.963 [2024-11-30 15:44:22.733745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.963 #27 NEW cov: 12545 ft: 14830 corp: 8/114b lim: 30 exec/s: 0 rss: 72Mb L: 21/30 MS: 1 EraseBytes- 00:07:14.963 [2024-11-30 15:44:22.782492] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:14.963 [2024-11-30 15:44:22.782862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:080800ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.963 [2024-11-30 15:44:22.782891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.963 #28 NEW cov: 12545 ft: 14882 corp: 9/125b lim: 30 exec/s: 0 rss: 72Mb L: 11/30 MS: 1 CrossOver- 00:07:14.963 [2024-11-30 15:44:22.852642] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:14.963 [2024-11-30 15:44:22.852989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:080883ff cdw11:00000003 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.963 [2024-11-30 15:44:22.853021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.963 #29 NEW cov: 12545 ft: 14931 corp: 10/136b lim: 30 exec/s: 0 rss: 73Mb L: 11/30 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:14.963 [2024-11-30 15:44:22.922688] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (8196) > buf size (4096) 00:07:14.963 [2024-11-30 15:44:22.923118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.963 [2024-11-30 15:44:22.923147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.222 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:15.222 #30 NEW cov: 12568 ft: 15021 corp: 11/147b lim: 30 exec/s: 0 rss: 73Mb L: 11/30 MS: 1 ChangeBinInt- 00:07:15.222 [2024-11-30 15:44:22.973042] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.222 [2024-11-30 15:44:22.973219] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.222 [2024-11-30 15:44:22.973385] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000029ff 00:07:15.223 [2024-11-30 15:44:22.973551] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.223 [2024-11-30 15:44:22.973735] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.223 [2024-11-30 15:44:22.974087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.223 [2024-11-30 15:44:22.974116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.223 [2024-11-30 15:44:22.974233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:1eff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.223 [2024-11-30 15:44:22.974249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.223 [2024-11-30 15:44:22.974366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.223 [2024-11-30 15:44:22.974382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.223 [2024-11-30 15:44:22.974502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.223 [2024-11-30 15:44:22.974521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.223 [2024-11-30 15:44:22.974638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.223 [2024-11-30 15:44:22.974654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 
dnr:0 00:07:15.223 #31 NEW cov: 12568 ft: 15058 corp: 12/177b lim: 30 exec/s: 0 rss: 73Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:15.223 [2024-11-30 15:44:23.042872] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.223 [2024-11-30 15:44:23.043045] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.223 [2024-11-30 15:44:23.043390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:080883ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.223 [2024-11-30 15:44:23.043420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.223 [2024-11-30 15:44:23.043542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:5dff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.223 [2024-11-30 15:44:23.043564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.223 #32 NEW cov: 12568 ft: 15347 corp: 13/189b lim: 30 exec/s: 32 rss: 73Mb L: 12/30 MS: 1 InsertByte- 00:07:15.223 [2024-11-30 15:44:23.113055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.223 [2024-11-30 15:44:23.113092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.223 #34 NEW cov: 12585 ft: 15462 corp: 14/198b lim: 30 exec/s: 34 rss: 73Mb L: 9/30 MS: 2 ChangeBit-CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:15.223 [2024-11-30 15:44:23.172791] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (8196) > buf size (4096) 00:07:15.223 [2024-11-30 15:44:23.173155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.223 [2024-11-30 15:44:23.173191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.482 #40 NEW cov: 12585 ft: 15492 corp: 15/209b lim: 30 exec/s: 40 rss: 73Mb L: 11/30 MS: 1 ChangeByte- 00:07:15.482 [2024-11-30 15:44:23.263009] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300003a3a 00:07:15.482 [2024-11-30 15:44:23.263190] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200003a3a 00:07:15.482 [2024-11-30 15:44:23.263331] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ffff 00:07:15.482 [2024-11-30 15:44:23.263670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.482 [2024-11-30 15:44:23.263707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.482 [2024-11-30 15:44:23.263845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3a3a023a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.482 [2024-11-30 15:44:23.263869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.482 [2024-11-30 15:44:23.264006] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:3a3a023a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.482 [2024-11-30 15:44:23.264030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.482 #41 NEW cov: 12585 ft: 15617 corp: 16/232b lim: 30 exec/s: 41 rss: 73Mb L: 23/30 MS: 1 InsertRepeatedBytes- 00:07:15.482 [2024-11-30 15:44:23.333316] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.482 [2024-11-30 15:44:23.333501] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.482 [2024-11-30 15:44:23.333653] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.483 [2024-11-30 15:44:23.333813] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.483 [2024-11-30 15:44:23.333970] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.483 [2024-11-30 15:44:23.334406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.483 [2024-11-30 15:44:23.334459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.483 [2024-11-30 15:44:23.334615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.483 [2024-11-30 15:44:23.334648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.483 [2024-11-30 15:44:23.334795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.483 [2024-11-30 15:44:23.334826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.483 [2024-11-30 15:44:23.334983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.483 [2024-11-30 15:44:23.335016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.483 [2024-11-30 15:44:23.335169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ff0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.483 [2024-11-30 15:44:23.335200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.483 #42 NEW cov: 12585 ft: 15649 corp: 17/262b lim: 30 exec/s: 42 rss: 73Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:15.483 [2024-11-30 15:44:23.392936] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.483 [2024-11-30 15:44:23.393308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0cff83ac cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.483 [2024-11-30 15:44:23.393337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.483 #43 NEW cov: 
12585 ft: 15689 corp: 18/269b lim: 30 exec/s: 43 rss: 73Mb L: 7/30 MS: 1 ChangeBit- 00:07:15.483 [2024-11-30 15:44:23.442950] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000acff 00:07:15.483 [2024-11-30 15:44:23.443295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.483 [2024-11-30 15:44:23.443326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.742 #44 NEW cov: 12585 ft: 15740 corp: 19/276b lim: 30 exec/s: 44 rss: 73Mb L: 7/30 MS: 1 ShuffleBytes- 00:07:15.742 [2024-11-30 15:44:23.493289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.742 [2024-11-30 15:44:23.493318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.742 #45 NEW cov: 12585 ft: 15760 corp: 20/285b lim: 30 exec/s: 45 rss: 73Mb L: 9/30 MS: 1 CopyPart- 00:07:15.742 [2024-11-30 15:44:23.562864] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.742 [2024-11-30 15:44:23.563034] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:15.742 [2024-11-30 15:44:23.563363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.742 [2024-11-30 15:44:23.563391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.742 [2024-11-30 15:44:23.563503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.742 [2024-11-30 15:44:23.563519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.742 #46 NEW cov: 12585 ft: 15782 corp: 21/297b lim: 30 exec/s: 46 rss: 73Mb L: 12/30 MS: 1 CrossOver- 00:07:15.742 [2024-11-30 15:44:23.613045] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:15.742 [2024-11-30 15:44:23.613411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.742 [2024-11-30 15:44:23.613443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.742 #47 NEW cov: 12585 ft: 15787 corp: 22/303b lim: 30 exec/s: 47 rss: 73Mb L: 6/30 MS: 1 EraseBytes- 00:07:15.742 [2024-11-30 15:44:23.663100] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (8196) > buf size (4096) 00:07:15.742 [2024-11-30 15:44:23.663456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.742 [2024-11-30 15:44:23.663486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.742 #48 NEW cov: 12585 ft: 15818 corp: 23/314b lim: 30 exec/s: 48 rss: 73Mb L: 11/30 MS: 1 PersAutoDict- DE: "\021\000\000\000"- 00:07:16.002 [2024-11-30 
15:44:23.713178] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.002 [2024-11-30 15:44:23.713534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:080883ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.002 [2024-11-30 15:44:23.713564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.002 #49 NEW cov: 12585 ft: 15847 corp: 24/325b lim: 30 exec/s: 49 rss: 73Mb L: 11/30 MS: 1 ShuffleBytes- 00:07:16.002 [2024-11-30 15:44:23.763578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.002 [2024-11-30 15:44:23.763610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.003 #50 NEW cov: 12585 ft: 15868 corp: 25/335b lim: 30 exec/s: 50 rss: 73Mb L: 10/30 MS: 1 InsertByte- 00:07:16.003 [2024-11-30 15:44:23.813211] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.003 [2024-11-30 15:44:23.813572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0c7383ac cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.003 [2024-11-30 15:44:23.813603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.003 #51 NEW cov: 12585 ft: 15878 corp: 26/342b lim: 30 exec/s: 51 rss: 73Mb L: 7/30 MS: 1 ChangeByte- 00:07:16.003 [2024-11-30 15:44:23.883582] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.003 [2024-11-30 15:44:23.883753] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.003 [2024-11-30 15:44:23.883901] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000029ff 00:07:16.003 [2024-11-30 15:44:23.884046] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.003 [2024-11-30 15:44:23.884195] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.003 [2024-11-30 15:44:23.884527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.003 [2024-11-30 15:44:23.884554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.003 [2024-11-30 15:44:23.884678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:1eff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.003 [2024-11-30 15:44:23.884696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.003 [2024-11-30 15:44:23.884810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.003 [2024-11-30 15:44:23.884825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.003 [2024-11-30 15:44:23.884940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 
cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.003 [2024-11-30 15:44:23.884960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.003 [2024-11-30 15:44:23.885074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ff2983ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.003 [2024-11-30 15:44:23.885092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:16.003 #52 NEW cov: 12585 ft: 15882 corp: 27/372b lim: 30 exec/s: 52 rss: 73Mb L: 30/30 MS: 1 CopyPart- 00:07:16.003 [2024-11-30 15:44:23.953587] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300001f1f 00:07:16.003 [2024-11-30 15:44:23.953757] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.003 [2024-11-30 15:44:23.953909] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.003 [2024-11-30 15:44:23.954073] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.003 [2024-11-30 15:44:23.954423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.003 [2024-11-30 15:44:23.954455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.003 [2024-11-30 15:44:23.954569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:1f1f831f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.003 [2024-11-30 15:44:23.954587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.003 [2024-11-30 15:44:23.954709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.003 [2024-11-30 15:44:23.954725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.003 [2024-11-30 15:44:23.954847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.003 [2024-11-30 15:44:23.954865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.263 #53 NEW cov: 12585 ft: 15901 corp: 28/398b lim: 30 exec/s: 53 rss: 73Mb L: 26/30 MS: 1 InsertRepeatedBytes- 00:07:16.263 [2024-11-30 15:44:24.023709] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300001f1f 00:07:16.263 [2024-11-30 15:44:24.023861] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300001f1f 00:07:16.263 [2024-11-30 15:44:24.024010] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.263 [2024-11-30 15:44:24.024159] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.263 [2024-11-30 15:44:24.024311] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:16.263 [2024-11-30 15:44:24.024646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG 
PAGE (02) qid:0 cid:4 nsid:0 cdw10:08ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.263 [2024-11-30 15:44:24.024675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.263 [2024-11-30 15:44:24.024794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.263 [2024-11-30 15:44:24.024811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.263 [2024-11-30 15:44:24.024926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1fff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.263 [2024-11-30 15:44:24.024946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.263 [2024-11-30 15:44:24.025067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.263 [2024-11-30 15:44:24.025084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.263 [2024-11-30 15:44:24.025203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.263 [2024-11-30 15:44:24.025219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:16.263 #54 NEW cov: 12585 ft: 15934 corp: 29/428b lim: 30 exec/s: 27 rss: 73Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:16.263 #54 DONE cov: 12585 ft: 15934 corp: 29/428b lim: 30 exec/s: 27 rss: 73Mb 00:07:16.263 ###### Recommended dictionary. ###### 00:07:16.263 "\377\377\377\377" # Uses: 3 00:07:16.263 "\021\000\000\000" # Uses: 1 00:07:16.263 "\001\000\000\000\000\000\000\000" # Uses: 0 00:07:16.263 ###### End of recommended dictionary. 
###### 00:07:16.263 Done 54 runs in 2 second(s) 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:16.263 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:16.264 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:16.264 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:16.264 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:16.264 15:44:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:16.264 [2024-11-30 15:44:24.215980] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:16.264 [2024-11-30 15:44:24.216062] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1711296 ] 00:07:16.833 [2024-11-30 15:44:24.530962] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
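The xtrace lines above show how nvmf/run.sh stamps out each fuzzer instance: instance N gets port 44NN, its own JSON config rewritten via sed, a dedicated corpus directory, and an LSAN suppression file, before llvm_nvme_fuzz runs for one timed second. The following is a minimal sketch of replaying instance 2 by hand under stated assumptions — the paths are illustrative rather than the CI workspace, and the output redirections are assumed, since `set -x` does not show them:

```bash
#!/usr/bin/env bash
# Sketch only: distilled from the run.sh trace above, not the script itself.
SPDK=~/spdk                                  # assumes a tree built with --with-fuzzer
CORPUS=$SPDK/../corpus/llvm_nvmf_2
TRID='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402'

mkdir -p "$CORPUS"
# Instance 2 listens on port 4402, so rewrite the default 4420 in the JSON config
# (redirection to the per-run config file is assumed).
sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' \
    "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_2.conf

# Suppress leaks expected when the target is torn down after the timed run
# (the trace echoes these two symbols; writing them to the file is assumed).
printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > /var/tmp/suppress_nvmf_fuzz
export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0

# Flags as seen in the trace: -m 0x1 core mask, -s 512 MiB of hugepage memory,
# -t 1 second runtime, -Z 2 fuzzer type, -D corpus dir, -P output dir, -F target ID.
"$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -P "$SPDK/../output/llvm/" -F "$TRID" -c /tmp/fuzz_json_2.conf \
    -t 1 -D "$CORPUS" -Z 2
```

For reading the output that follows: in libFuzzer status lines such as "#54 DONE cov: 12585 ft: 15934 corp: 29/428b lim: 30 exec/s: 27 rss: 73Mb", `cov` counts covered code points, `ft` counts features, `corp` gives corpus size in units/bytes, `lim` is the current input-length cap, and `MS` names the mutation sequence that produced a NEW input. The interleaved nvme_qpair.c NOTICE lines are the SPDK target printing each fuzzed admin command and its completion, which is expected to fail with INVALID OPCODE or INVALID FIELD.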
00:07:16.833 [2024-11-30 15:44:24.576984] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.833 [2024-11-30 15:44:24.594342] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.833 [2024-11-30 15:44:24.646681] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:16.833 [2024-11-30 15:44:24.663000] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:16.833 INFO: Running with entropic power schedule (0xFF, 100). 00:07:16.833 INFO: Seed: 2883974443 00:07:16.833 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:16.833 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:16.833 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:16.833 INFO: A corpus is not provided, starting from an empty corpus 00:07:16.833 #2 INITED exec/s: 0 rss: 64Mb 00:07:16.833 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:16.833 This may also happen if the target rejected all inputs we tried so far 00:07:16.833 [2024-11-30 15:44:24.718260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0f0a000f cdw11:00000f0f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.833 [2024-11-30 15:44:24.718289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.093 NEW_FUNC[1/716]: 0x4620b8 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:17.093 NEW_FUNC[2/716]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:17.093 #12 NEW cov: 12274 ft: 12265 corp: 2/14b lim: 35 exec/s: 0 rss: 71Mb L: 13/13 MS: 5 ShuffleBytes-InsertRepeatedBytes-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:17.093 [2024-11-30 15:44:25.038723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.093 [2024-11-30 15:44:25.038757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.093 [2024-11-30 15:44:25.038833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.093 [2024-11-30 15:44:25.038848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.093 [2024-11-30 15:44:25.038906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.093 [2024-11-30 15:44:25.038920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.093 [2024-11-30 15:44:25.038977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.093 [2024-11-30 15:44:25.038992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.352 #15 NEW cov: 12387 ft: 13596 corp: 3/46b lim: 35 
exec/s: 0 rss: 72Mb L: 32/32 MS: 3 InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:17.352 [2024-11-30 15:44:25.078647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00f1 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.078675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.352 [2024-11-30 15:44:25.078749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00f8 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.078774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.352 [2024-11-30 15:44:25.078832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.078849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.352 [2024-11-30 15:44:25.078906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.078920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.352 #16 NEW cov: 12393 ft: 13726 corp: 4/78b lim: 35 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeBinInt- 00:07:17.352 [2024-11-30 15:44:25.138279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.138306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.352 #18 NEW cov: 12478 ft: 14074 corp: 5/85b lim: 35 exec/s: 0 rss: 72Mb L: 7/32 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:17.352 [2024-11-30 15:44:25.178701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00f1 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.178728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.352 [2024-11-30 15:44:25.178814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00f8ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.178829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.352 [2024-11-30 15:44:25.178888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.178902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.352 [2024-11-30 15:44:25.178959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.178973] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.352 #19 NEW cov: 12478 ft: 14129 corp: 6/117b lim: 35 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ShuffleBytes- 00:07:17.352 [2024-11-30 15:44:25.238465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00f1 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.238492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.352 [2024-11-30 15:44:25.238552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.238567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.352 #20 NEW cov: 12478 ft: 14476 corp: 7/135b lim: 35 exec/s: 0 rss: 72Mb L: 18/32 MS: 1 EraseBytes- 00:07:17.352 [2024-11-30 15:44:25.298710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.298736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.352 [2024-11-30 15:44:25.298796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.298811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.352 [2024-11-30 15:44:25.298871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff0026ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.298885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.352 [2024-11-30 15:44:25.298943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.352 [2024-11-30 15:44:25.298957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.611 #21 NEW cov: 12478 ft: 14576 corp: 8/167b lim: 35 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeByte- 00:07:17.611 [2024-11-30 15:44:25.338731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.611 [2024-11-30 15:44:25.338757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.611 [2024-11-30 15:44:25.338831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.611 [2024-11-30 15:44:25.338847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.611 [2024-11-30 15:44:25.338902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0010 cdw11:2600ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:17.611 [2024-11-30 15:44:25.338916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.611 [2024-11-30 15:44:25.338974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.611 [2024-11-30 15:44:25.338988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.611 #22 NEW cov: 12478 ft: 14630 corp: 9/201b lim: 35 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 CMP- DE: "\001\020"- 00:07:17.611 [2024-11-30 15:44:25.398510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0f0a000f cdw11:00000f0f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.611 [2024-11-30 15:44:25.398536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.611 [2024-11-30 15:44:25.398613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0000003a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.611 [2024-11-30 15:44:25.398628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.611 #23 NEW cov: 12478 ft: 14655 corp: 10/215b lim: 35 exec/s: 0 rss: 72Mb L: 14/34 MS: 1 InsertByte- 00:07:17.611 [2024-11-30 15:44:25.458358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff93000a cdw11:af00c6b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.611 [2024-11-30 15:44:25.458383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.611 #24 NEW cov: 12478 ft: 14739 corp: 11/224b lim: 35 exec/s: 0 rss: 72Mb L: 9/34 MS: 1 CMP- DE: "\377\223\306\260\257n\231~"- 00:07:17.611 [2024-11-30 15:44:25.498816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff0020 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.611 [2024-11-30 15:44:25.498842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.611 [2024-11-30 15:44:25.498898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.611 [2024-11-30 15:44:25.498913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.611 [2024-11-30 15:44:25.498974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.611 [2024-11-30 15:44:25.498988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.611 [2024-11-30 15:44:25.499043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.611 [2024-11-30 15:44:25.499057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.611 #30 NEW cov: 12478 ft: 14846 corp: 12/256b lim: 35 exec/s: 0 rss: 72Mb L: 
32/34 MS: 1 ChangeBinInt- 00:07:17.611 [2024-11-30 15:44:25.538435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0f0a000f cdw11:00000f0f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.611 [2024-11-30 15:44:25.538461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.611 #31 NEW cov: 12478 ft: 14931 corp: 13/269b lim: 35 exec/s: 0 rss: 72Mb L: 13/34 MS: 1 ShuffleBytes- 00:07:17.871 [2024-11-30 15:44:25.578897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.578923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.871 [2024-11-30 15:44:25.578983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.578998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.871 [2024-11-30 15:44:25.579055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff002b cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.579069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.871 [2024-11-30 15:44:25.579128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.579143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.871 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:17.871 #37 NEW cov: 12501 ft: 14943 corp: 14/301b lim: 35 exec/s: 0 rss: 72Mb L: 32/34 MS: 1 ChangeByte- 00:07:17.871 [2024-11-30 15:44:25.618494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffaf000a cdw11:9300b0c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.618519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.871 #38 NEW cov: 12501 ft: 15017 corp: 15/310b lim: 35 exec/s: 0 rss: 73Mb L: 9/34 MS: 1 ShuffleBytes- 00:07:17.871 [2024-11-30 15:44:25.678507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff93000a cdw11:6000c6b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.678536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.871 #39 NEW cov: 12501 ft: 15077 corp: 16/320b lim: 35 exec/s: 39 rss: 73Mb L: 10/34 MS: 1 InsertByte- 00:07:17.871 [2024-11-30 15:44:25.718985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.719012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.871 
[2024-11-30 15:44:25.719097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.719112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.871 [2024-11-30 15:44:25.719172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff002bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.719186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.871 [2024-11-30 15:44:25.719245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.719259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.871 #40 NEW cov: 12501 ft: 15104 corp: 17/352b lim: 35 exec/s: 40 rss: 73Mb L: 32/34 MS: 1 CrossOver- 00:07:17.871 [2024-11-30 15:44:25.779034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00f1 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.779060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.871 [2024-11-30 15:44:25.779135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:fffb00ff cdw11:ff00f8ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.779150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.871 [2024-11-30 15:44:25.779209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.779223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.871 [2024-11-30 15:44:25.779283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.779297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.871 #41 NEW cov: 12501 ft: 15112 corp: 18/384b lim: 35 exec/s: 41 rss: 73Mb L: 32/34 MS: 1 ChangeBit- 00:07:17.871 [2024-11-30 15:44:25.819024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.819050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.871 [2024-11-30 15:44:25.819127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.819142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.871 [2024-11-30 15:44:25.819200] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff002bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.819215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.871 [2024-11-30 15:44:25.819272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.871 [2024-11-30 15:44:25.819287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.131 [2024-11-30 15:44:25.878750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.878779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.131 [2024-11-30 15:44:25.878853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff2b00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.878868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.131 #43 NEW cov: 12501 ft: 15131 corp: 19/404b lim: 35 exec/s: 43 rss: 73Mb L: 20/34 MS: 2 EraseBytes-EraseBytes- 00:07:18.131 [2024-11-30 15:44:25.919061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.919087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.131 [2024-11-30 15:44:25.919148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00f7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.919162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.131 [2024-11-30 15:44:25.919221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff002bff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.919236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.131 [2024-11-30 15:44:25.919293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.919307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.131 #44 NEW cov: 12501 ft: 15150 corp: 20/436b lim: 35 exec/s: 44 rss: 73Mb L: 32/34 MS: 1 ChangeBit- 00:07:18.131 [2024-11-30 15:44:25.959101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.959126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.131 [2024-11-30 15:44:25.959184] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.959198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.131 [2024-11-30 15:44:25.959255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.959270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.131 [2024-11-30 15:44:25.959326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:a0ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.959339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.131 #45 NEW cov: 12501 ft: 15163 corp: 21/469b lim: 35 exec/s: 45 rss: 73Mb L: 33/34 MS: 1 InsertByte- 00:07:18.131 [2024-11-30 15:44:25.998821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b9b900b9 cdw11:b900b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.998848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.131 [2024-11-30 15:44:25.998906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b9b900b9 cdw11:b900b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:25.998927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.131 #47 NEW cov: 12501 ft: 15192 corp: 22/484b lim: 35 exec/s: 47 rss: 73Mb L: 15/34 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:18.131 [2024-11-30 15:44:26.039129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff0020 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:26.039155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.131 [2024-11-30 15:44:26.039216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffe700ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.131 [2024-11-30 15:44:26.039230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.132 [2024-11-30 15:44:26.039288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.132 [2024-11-30 15:44:26.039301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.132 [2024-11-30 15:44:26.039358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.132 [2024-11-30 15:44:26.039372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.132 #48 NEW cov: 12501 ft: 15207 corp: 23/516b lim: 
35 exec/s: 48 rss: 73Mb L: 32/34 MS: 1 ChangeByte- 00:07:18.391 [2024-11-30 15:44:26.098733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b9b900b9 cdw11:b900b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 15:44:26.098759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.391 #49 NEW cov: 12501 ft: 15225 corp: 24/527b lim: 35 exec/s: 49 rss: 73Mb L: 11/34 MS: 1 EraseBytes- 00:07:18.391 [2024-11-30 15:44:26.159010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00f1 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 15:44:26.159036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.391 [2024-11-30 15:44:26.159094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 15:44:26.159108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.391 [2024-11-30 15:44:26.159165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:930040c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 15:44:26.159179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.391 #50 NEW cov: 12501 ft: 15404 corp: 25/548b lim: 35 exec/s: 50 rss: 73Mb L: 21/34 MS: 1 CrossOver- 00:07:18.391 [2024-11-30 15:44:26.218780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b9b900b9 cdw11:b900b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 15:44:26.218806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.391 #51 NEW cov: 12501 ft: 15421 corp: 26/556b lim: 35 exec/s: 51 rss: 73Mb L: 8/34 MS: 1 EraseBytes- 00:07:18.391 [2024-11-30 15:44:26.279206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 15:44:26.279232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.391 [2024-11-30 15:44:26.279312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff2e00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 15:44:26.279327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.391 [2024-11-30 15:44:26.279384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff002b cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 15:44:26.279399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.391 [2024-11-30 15:44:26.279455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 
15:44:26.279469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.391 #52 NEW cov: 12501 ft: 15443 corp: 27/588b lim: 35 exec/s: 52 rss: 73Mb L: 32/34 MS: 1 ChangeByte- 00:07:18.391 [2024-11-30 15:44:26.319282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 15:44:26.319308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.391 [2024-11-30 15:44:26.319385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 15:44:26.319400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.391 [2024-11-30 15:44:26.319461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff8d002b cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.391 [2024-11-30 15:44:26.319475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.392 [2024-11-30 15:44:26.319532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.392 [2024-11-30 15:44:26.319547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.392 #53 NEW cov: 12501 ft: 15537 corp: 28/620b lim: 35 exec/s: 53 rss: 73Mb L: 32/34 MS: 1 ChangeByte- 00:07:18.651 [2024-11-30 15:44:26.358846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffaf000a cdw11:9300b0c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.651 [2024-11-30 15:44:26.358873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.651 #54 NEW cov: 12501 ft: 15577 corp: 29/629b lim: 35 exec/s: 54 rss: 73Mb L: 9/34 MS: 1 CopyPart- 00:07:18.651 [2024-11-30 15:44:26.419428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff0020 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.651 [2024-11-30 15:44:26.419455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.651 [2024-11-30 15:44:26.419513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.651 [2024-11-30 15:44:26.419528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.651 [2024-11-30 15:44:26.419588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.651 [2024-11-30 15:44:26.419607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.651 [2024-11-30 15:44:26.419667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 
cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.651 [2024-11-30 15:44:26.419681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.651 [2024-11-30 15:44:26.419740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.651 [2024-11-30 15:44:26.419754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.651 #55 NEW cov: 12501 ft: 15619 corp: 30/664b lim: 35 exec/s: 55 rss: 73Mb L: 35/35 MS: 1 CrossOver- 00:07:18.651 [2024-11-30 15:44:26.458889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.651 [2024-11-30 15:44:26.458914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.651 #56 NEW cov: 12501 ft: 15627 corp: 31/671b lim: 35 exec/s: 56 rss: 73Mb L: 7/35 MS: 1 CopyPart- 00:07:18.651 [2024-11-30 15:44:26.518939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffaf000a cdw11:9300b0c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.651 [2024-11-30 15:44:26.518966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.651 #57 NEW cov: 12501 ft: 15636 corp: 32/682b lim: 35 exec/s: 57 rss: 74Mb L: 11/35 MS: 1 CopyPart- 00:07:18.651 [2024-11-30 15:44:26.578895] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:18.651 [2024-11-30 15:44:26.579160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0f0a000f cdw11:0f000110 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.651 [2024-11-30 15:44:26.579187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.651 [2024-11-30 15:44:26.579247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:003a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.651 [2024-11-30 15:44:26.579264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.941 #63 NEW cov: 12512 ft: 15727 corp: 33/698b lim: 35 exec/s: 63 rss: 74Mb L: 16/35 MS: 1 PersAutoDict- DE: "\001\020"- 00:07:18.941 [2024-11-30 15:44:26.639073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b9b900b9 cdw11:b900b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.941 [2024-11-30 15:44:26.639099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.941 [2024-11-30 15:44:26.639158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b9ff00b9 cdw11:b900b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.941 [2024-11-30 15:44:26.639172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.941 #64 NEW cov: 12512 ft: 15735 corp: 34/714b lim: 35 exec/s: 64 rss: 74Mb L: 16/35 MS: 1 InsertByte- 00:07:18.941 [2024-11-30 
15:44:26.679227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b9b900b9 cdw11:b900b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.941 [2024-11-30 15:44:26.679254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.941 [2024-11-30 15:44:26.679315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b9b900b9 cdw11:b900b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.941 [2024-11-30 15:44:26.679329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.941 [2024-11-30 15:44:26.679392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffb900b9 cdw11:ff00b9b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.941 [2024-11-30 15:44:26.679406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.941 #65 NEW cov: 12512 ft: 15736 corp: 35/740b lim: 35 exec/s: 32 rss: 74Mb L: 26/35 MS: 1 CopyPart- 00:07:18.941 #65 DONE cov: 12512 ft: 15736 corp: 35/740b lim: 35 exec/s: 32 rss: 74Mb 00:07:18.941 ###### Recommended dictionary. ###### 00:07:18.941 "\001\020" # Uses: 1 00:07:18.941 "\377\223\306\260\257n\231~" # Uses: 0 00:07:18.941 ###### End of recommended dictionary. ###### 00:07:18.941 Done 65 runs in 2 second(s) 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz 
-- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:18.941 15:44:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:07:18.942 [2024-11-30 15:44:26.869174] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:18.942 [2024-11-30 15:44:26.869246] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1711805 ] 00:07:19.511 [2024-11-30 15:44:27.180091] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:19.511 [2024-11-30 15:44:27.225959] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.511 [2024-11-30 15:44:27.242284] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.511 [2024-11-30 15:44:27.294510] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:19.511 [2024-11-30 15:44:27.310847] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:19.511 INFO: Running with entropic power schedule (0xFF, 100). 00:07:19.511 INFO: Seed: 1234001215 00:07:19.511 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:19.511 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:19.511 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:19.511 INFO: A corpus is not provided, starting from an empty corpus 00:07:19.511 #2 INITED exec/s: 0 rss: 64Mb 00:07:19.511 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:19.511 This may also happen if the target rejected all inputs we tried so far 00:07:19.771 NEW_FUNC[1/704]: 0x463d98 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:19.771 NEW_FUNC[2/704]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:19.771 #4 NEW cov: 12150 ft: 12182 corp: 2/11b lim: 20 exec/s: 0 rss: 72Mb L: 10/10 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:19.771 NEW_FUNC[1/1]: 0x1fbe6a8 in thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1080 00:07:19.771 #5 NEW cov: 12296 ft: 12818 corp: 3/22b lim: 20 exec/s: 0 rss: 72Mb L: 11/11 MS: 1 InsertByte- 00:07:20.030 #6 NEW cov: 12302 ft: 13186 corp: 4/32b lim: 20 exec/s: 0 rss: 72Mb L: 10/11 MS: 1 ChangeBit- 00:07:20.030 #7 NEW cov: 12387 ft: 13376 corp: 5/43b lim: 20 exec/s: 0 rss: 72Mb L: 11/11 MS: 1 ChangeBit- 00:07:20.030 #8 NEW cov: 12387 ft: 13435 corp: 6/54b lim: 20 exec/s: 0 rss: 72Mb L: 11/11 MS: 1 ChangeByte- 00:07:20.030 #9 NEW cov: 12391 ft: 13917 corp: 7/68b lim: 20 exec/s: 0 rss: 72Mb L: 14/14 MS: 1 CopyPart- 00:07:20.030 #10 NEW cov: 12391 ft: 13961 corp: 8/79b lim: 20 exec/s: 0 rss: 72Mb L: 11/14 MS: 1 ChangeBinInt- 00:07:20.030 [2024-11-30 15:44:27.979686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:20.030 [2024-11-30 15:44:27.979726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.290 NEW_FUNC[1/17]: 0x13998e8 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3484 00:07:20.290 NEW_FUNC[2/17]: 0x139a468 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3426 00:07:20.290 #11 NEW cov: 12654 ft: 14472 corp: 9/98b lim: 20 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:20.290 [2024-11-30 15:44:28.039485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:20.290 [2024-11-30 15:44:28.039515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.290 #12 NEW cov: 12654 ft: 14517 corp: 10/109b lim: 20 exec/s: 0 rss: 72Mb L: 11/19 MS: 1 CMP- DE: "\000\004\000\000\000\000\000\000"- 00:07:20.290 #13 NEW cov: 12654 ft: 14553 corp: 11/119b lim: 20 exec/s: 0 rss: 72Mb L: 10/19 MS: 1 ChangeBit- 00:07:20.290 [2024-11-30 15:44:28.139612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:20.290 [2024-11-30 15:44:28.139655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.290 #14 NEW cov: 12654 ft: 14647 corp: 12/131b lim: 20 exec/s: 0 rss: 73Mb L: 12/19 MS: 1 InsertByte- 00:07:20.290 [2024-11-30 15:44:28.199576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:20.290 [2024-11-30 15:44:28.199608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.290 #15 NEW cov: 12654 ft: 14675 corp: 13/141b lim: 20 exec/s: 0 rss: 73Mb L: 10/19 
MS: 1 PersAutoDict- DE: "\000\004\000\000\000\000\000\000"- 00:07:20.549 #16 NEW cov: 12654 ft: 14729 corp: 14/161b lim: 20 exec/s: 0 rss: 73Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:20.549 #17 NEW cov: 12654 ft: 14833 corp: 15/171b lim: 20 exec/s: 0 rss: 73Mb L: 10/20 MS: 1 ShuffleBytes- 00:07:20.549 #18 NEW cov: 12654 ft: 14850 corp: 16/182b lim: 20 exec/s: 18 rss: 73Mb L: 11/20 MS: 1 PersAutoDict- DE: "\000\004\000\000\000\000\000\000"- 00:07:20.549 #19 NEW cov: 12654 ft: 14886 corp: 17/193b lim: 20 exec/s: 19 rss: 73Mb L: 11/20 MS: 1 InsertByte- 00:07:20.549 #20 NEW cov: 12654 ft: 14923 corp: 18/204b lim: 20 exec/s: 20 rss: 73Mb L: 11/20 MS: 1 ShuffleBytes- 00:07:20.810 #21 NEW cov: 12654 ft: 14942 corp: 19/219b lim: 20 exec/s: 21 rss: 73Mb L: 15/20 MS: 1 InsertByte- 00:07:20.810 NEW_FUNC[1/2]: 0x1510b08 in nvmf_transport_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:784 00:07:20.810 NEW_FUNC[2/2]: 0x1538018 in nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3702 00:07:20.810 #22 NEW cov: 12710 ft: 15020 corp: 20/235b lim: 20 exec/s: 22 rss: 73Mb L: 16/20 MS: 1 CrossOver- 00:07:20.810 [2024-11-30 15:44:28.580077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:20.810 [2024-11-30 15:44:28.580108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.810 #23 NEW cov: 12710 ft: 15061 corp: 21/255b lim: 20 exec/s: 23 rss: 73Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:20.810 [2024-11-30 15:44:28.640101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:20.810 [2024-11-30 15:44:28.640130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.810 NEW_FUNC[1/1]: 0x15c02d8 in _nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3649 00:07:20.810 #24 NEW cov: 12737 ft: 15224 corp: 22/271b lim: 20 exec/s: 24 rss: 73Mb L: 16/20 MS: 1 CopyPart- 00:07:20.810 #25 NEW cov: 12737 ft: 15237 corp: 23/286b lim: 20 exec/s: 25 rss: 73Mb L: 15/20 MS: 1 CrossOver- 00:07:20.810 #26 NEW cov: 12737 ft: 15248 corp: 24/296b lim: 20 exec/s: 26 rss: 73Mb L: 10/20 MS: 1 CopyPart- 00:07:21.069 #27 NEW cov: 12737 ft: 15268 corp: 25/313b lim: 20 exec/s: 27 rss: 73Mb L: 17/20 MS: 1 InsertByte- 00:07:21.069 #28 NEW cov: 12737 ft: 15281 corp: 26/328b lim: 20 exec/s: 28 rss: 73Mb L: 15/20 MS: 1 ChangeBit- 00:07:21.069 [2024-11-30 15:44:28.900002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.069 [2024-11-30 15:44:28.900029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.069 #29 NEW cov: 12737 ft: 15298 corp: 27/339b lim: 20 exec/s: 29 rss: 73Mb L: 11/20 MS: 1 ChangeBinInt- 00:07:21.069 #30 NEW cov: 12737 ft: 15354 corp: 28/357b lim: 20 exec/s: 30 rss: 73Mb L: 18/20 MS: 1 CMP- DE: "\377\223\306\262\344\235n\366"- 00:07:21.069 [2024-11-30 15:44:28.980257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.069 [2024-11-30 15:44:28.980284] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.069 #31 NEW cov: 12737 ft: 15360 corp: 29/377b lim: 20 exec/s: 31 rss: 73Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:21.329 #32 NEW cov: 12737 ft: 15383 corp: 30/391b lim: 20 exec/s: 32 rss: 73Mb L: 14/20 MS: 1 ChangeByte- 00:07:21.329 #33 NEW cov: 12737 ft: 15409 corp: 31/400b lim: 20 exec/s: 33 rss: 73Mb L: 9/20 MS: 1 EraseBytes- 00:07:21.329 #34 NEW cov: 12737 ft: 15437 corp: 32/415b lim: 20 exec/s: 34 rss: 74Mb L: 15/20 MS: 1 PersAutoDict- DE: "\377\223\306\262\344\235n\366"- 00:07:21.329 #35 NEW cov: 12737 ft: 15465 corp: 33/426b lim: 20 exec/s: 35 rss: 74Mb L: 11/20 MS: 1 ChangeBinInt- 00:07:21.329 #36 NEW cov: 12737 ft: 15480 corp: 34/446b lim: 20 exec/s: 36 rss: 74Mb L: 20/20 MS: 1 CrossOver- 00:07:21.589 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:21.589 #37 NEW cov: 12760 ft: 15536 corp: 35/466b lim: 20 exec/s: 37 rss: 74Mb L: 20/20 MS: 1 CopyPart- 00:07:21.589 #38 NEW cov: 12760 ft: 15550 corp: 36/478b lim: 20 exec/s: 19 rss: 74Mb L: 12/20 MS: 1 InsertByte- 00:07:21.589 #38 DONE cov: 12760 ft: 15550 corp: 36/478b lim: 20 exec/s: 19 rss: 74Mb 00:07:21.589 ###### Recommended dictionary. ###### 00:07:21.589 "\000\004\000\000\000\000\000\000" # Uses: 2 00:07:21.589 "\377\223\306\262\344\235n\366" # Uses: 1 00:07:21.589 ###### End of recommended dictionary. ###### 00:07:21.589 Done 38 runs in 2 second(s) 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:21.589 15:44:29 
llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:21.589 15:44:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:07:21.589 [2024-11-30 15:44:29.489641] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:21.589 [2024-11-30 15:44:29.489716] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1712128 ] 00:07:21.880 [2024-11-30 15:44:29.808484] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:22.140 [2024-11-30 15:44:29.854934] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.140 [2024-11-30 15:44:29.872924] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.140 [2024-11-30 15:44:29.925519] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:22.140 [2024-11-30 15:44:29.941847] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:22.140 INFO: Running with entropic power schedule (0xFF, 100). 00:07:22.140 INFO: Seed: 3866033297 00:07:22.140 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:22.140 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:22.140 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:22.140 INFO: A corpus is not provided, starting from an empty corpus 00:07:22.140 #2 INITED exec/s: 0 rss: 64Mb 00:07:22.140 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:22.140 This may also happen if the target rejected all inputs we tried so far 00:07:22.140 [2024-11-30 15:44:29.987067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01d7db56 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.140 [2024-11-30 15:44:29.987098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.399 NEW_FUNC[1/717]: 0x464e98 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:22.399 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:22.399 #20 NEW cov: 12296 ft: 12284 corp: 2/12b lim: 35 exec/s: 0 rss: 72Mb L: 11/11 MS: 3 InsertByte-CrossOver-CMP- DE: "\333V\001\327\263\306\224\000"- 00:07:22.399 [2024-11-30 15:44:30.307262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:56010adb cdw11:d7b30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.399 [2024-11-30 15:44:30.307299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.399 #21 NEW cov: 12409 ft: 13020 corp: 3/21b lim: 35 exec/s: 0 rss: 72Mb L: 9/11 MS: 1 PersAutoDict- DE: "\333V\001\327\263\306\224\000"- 00:07:22.399 [2024-11-30 15:44:30.347143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.400 [2024-11-30 15:44:30.347170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.659 #25 NEW cov: 12415 ft: 13286 corp: 4/30b lim: 35 exec/s: 0 rss: 72Mb L: 9/11 MS: 4 ChangeByte-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:22.659 [2024-11-30 15:44:30.387225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01d72356 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.659 [2024-11-30 15:44:30.387251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.659 #26 NEW cov: 12500 ft: 13544 corp: 5/41b lim: 35 exec/s: 0 rss: 72Mb L: 11/11 MS: 1 ChangeBinInt- 00:07:22.659 [2024-11-30 15:44:30.447198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01e32356 cdw11:d7b30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.659 [2024-11-30 15:44:30.447225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.659 #32 NEW cov: 12500 ft: 13730 corp: 6/53b lim: 35 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 InsertByte- 00:07:22.659 [2024-11-30 15:44:30.507629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.659 [2024-11-30 15:44:30.507664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.659 [2024-11-30 15:44:30.507736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:61610061 cdw11:61610002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.659 
[2024-11-30 15:44:30.507751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.659 [2024-11-30 15:44:30.507808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:61616161 cdw11:61610002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.660 [2024-11-30 15:44:30.507823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.660 #38 NEW cov: 12500 ft: 14530 corp: 7/75b lim: 35 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:07:22.660 [2024-11-30 15:44:30.567261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01d7db56 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.660 [2024-11-30 15:44:30.567287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.660 #42 NEW cov: 12500 ft: 14616 corp: 8/85b lim: 35 exec/s: 0 rss: 72Mb L: 10/22 MS: 4 ChangeBinInt-ShuffleBytes-InsertByte-PersAutoDict- DE: "\333V\001\327\263\306\224\000"- 00:07:22.660 [2024-11-30 15:44:30.607284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01d7db56 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.660 [2024-11-30 15:44:30.607309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.920 #43 NEW cov: 12500 ft: 14646 corp: 9/95b lim: 35 exec/s: 0 rss: 72Mb L: 10/22 MS: 1 PersAutoDict- DE: "\333V\001\327\263\306\224\000"- 00:07:22.920 [2024-11-30 15:44:30.667329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c694d7b3 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.920 [2024-11-30 15:44:30.667354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.920 #44 NEW cov: 12500 ft: 14677 corp: 10/102b lim: 35 exec/s: 0 rss: 72Mb L: 7/22 MS: 1 EraseBytes- 00:07:22.920 [2024-11-30 15:44:30.707324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d7b35601 cdw11:c6940000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.920 [2024-11-30 15:44:30.707349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.920 #45 NEW cov: 12500 ft: 14712 corp: 11/111b lim: 35 exec/s: 0 rss: 72Mb L: 9/22 MS: 1 CrossOver- 00:07:22.920 [2024-11-30 15:44:30.767342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.920 [2024-11-30 15:44:30.767366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.920 #46 NEW cov: 12500 ft: 14725 corp: 12/120b lim: 35 exec/s: 0 rss: 72Mb L: 9/22 MS: 1 ChangeASCIIInt- 00:07:22.920 [2024-11-30 15:44:30.807395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01d70100 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.920 [2024-11-30 15:44:30.807421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.920 #47 NEW cov: 
12500 ft: 14772 corp: 13/130b lim: 35 exec/s: 0 rss: 72Mb L: 10/22 MS: 1 CMP- DE: "\001\000"- 00:07:22.920 [2024-11-30 15:44:30.847811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01d7db56 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.920 [2024-11-30 15:44:30.847836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.920 [2024-11-30 15:44:30.847911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:bfbf00bf cdw11:bfbf0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.920 [2024-11-30 15:44:30.847926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.920 [2024-11-30 15:44:30.847983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:bfbfbfbf cdw11:bfbf0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.920 [2024-11-30 15:44:30.847996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.920 #48 NEW cov: 12500 ft: 14803 corp: 14/152b lim: 35 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:07:23.180 [2024-11-30 15:44:30.887481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d7b3db01 cdw11:c6940000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.180 [2024-11-30 15:44:30.887507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.180 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:23.180 #49 NEW cov: 12523 ft: 14907 corp: 15/162b lim: 35 exec/s: 0 rss: 72Mb L: 10/22 MS: 1 EraseBytes- 00:07:23.180 [2024-11-30 15:44:30.927462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01d7db56 cdw11:b0c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.180 [2024-11-30 15:44:30.927488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.180 #50 NEW cov: 12523 ft: 15100 corp: 16/172b lim: 35 exec/s: 0 rss: 72Mb L: 10/22 MS: 1 ChangeByte- 00:07:23.180 [2024-11-30 15:44:30.987883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01d70100 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.180 [2024-11-30 15:44:30.987909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.180 [2024-11-30 15:44:30.987985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:25500001 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.180 [2024-11-30 15:44:30.987999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.180 [2024-11-30 15:44:30.988058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.180 [2024-11-30 15:44:30.988071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.180 #51 NEW cov: 
12523 ft: 15125 corp: 17/194b lim: 35 exec/s: 51 rss: 72Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:07:23.180 [2024-11-30 15:44:31.047491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3dc6d7b3 cdw11:94000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.180 [2024-11-30 15:44:31.047518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.180 #52 NEW cov: 12523 ft: 15177 corp: 18/202b lim: 35 exec/s: 52 rss: 72Mb L: 8/22 MS: 1 InsertByte- 00:07:23.180 [2024-11-30 15:44:31.107549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d7b32356 cdw11:c6940000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.180 [2024-11-30 15:44:31.107576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.180 #53 NEW cov: 12523 ft: 15194 corp: 19/212b lim: 35 exec/s: 53 rss: 72Mb L: 10/22 MS: 1 EraseBytes- 00:07:23.438 [2024-11-30 15:44:31.147586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3dc6d7b3 cdw11:94000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.438 [2024-11-30 15:44:31.147620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.438 #54 NEW cov: 12523 ft: 15266 corp: 20/220b lim: 35 exec/s: 54 rss: 73Mb L: 8/22 MS: 1 CopyPart- 00:07:23.438 [2024-11-30 15:44:31.207583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00b35601 cdw11:c6940001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.439 [2024-11-30 15:44:31.207617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.439 #55 NEW cov: 12523 ft: 15281 corp: 21/229b lim: 35 exec/s: 55 rss: 73Mb L: 9/22 MS: 1 ShuffleBytes- 00:07:23.439 [2024-11-30 15:44:31.267624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a2356 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.439 [2024-11-30 15:44:31.267650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.439 #56 NEW cov: 12523 ft: 15292 corp: 22/240b lim: 35 exec/s: 56 rss: 73Mb L: 11/22 MS: 1 CopyPart- 00:07:23.439 [2024-11-30 15:44:31.307992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01d70100 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.439 [2024-11-30 15:44:31.308019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.439 [2024-11-30 15:44:31.308083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:01256d00 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.439 [2024-11-30 15:44:31.308098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.439 [2024-11-30 15:44:31.308157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.439 [2024-11-30 15:44:31.308170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.439 #57 NEW cov: 12523 ft: 15300 corp: 23/263b lim: 35 exec/s: 57 rss: 73Mb L: 23/23 MS: 1 InsertByte- 00:07:23.439 [2024-11-30 15:44:31.367665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01d73b56 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.439 [2024-11-30 15:44:31.367690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.439 #58 NEW cov: 12523 ft: 15347 corp: 24/273b lim: 35 exec/s: 58 rss: 73Mb L: 10/23 MS: 1 ChangeByte- 00:07:23.696 [2024-11-30 15:44:31.407722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d7b35601 cdw11:c6240000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.696 [2024-11-30 15:44:31.407748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.696 #59 NEW cov: 12523 ft: 15394 corp: 25/282b lim: 35 exec/s: 59 rss: 73Mb L: 9/23 MS: 1 ChangeByte- 00:07:23.696 [2024-11-30 15:44:31.448024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:df000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.696 [2024-11-30 15:44:31.448049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.696 [2024-11-30 15:44:31.448123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:61610000 cdw11:61610002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.696 [2024-11-30 15:44:31.448138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.696 [2024-11-30 15:44:31.448198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:61616161 cdw11:61610002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.696 [2024-11-30 15:44:31.448211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.696 #60 NEW cov: 12523 ft: 15397 corp: 26/305b lim: 35 exec/s: 60 rss: 73Mb L: 23/23 MS: 1 InsertByte- 00:07:23.696 [2024-11-30 15:44:31.507712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5601db41 cdw11:d7b30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.696 [2024-11-30 15:44:31.507738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.696 #61 NEW cov: 12523 ft: 15398 corp: 27/317b lim: 35 exec/s: 61 rss: 73Mb L: 12/23 MS: 1 InsertByte- 00:07:23.696 [2024-11-30 15:44:31.547737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00b35641 cdw11:c6940001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.696 [2024-11-30 15:44:31.547762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.696 #62 NEW cov: 12523 ft: 15417 corp: 28/326b lim: 35 exec/s: 62 rss: 73Mb L: 9/23 MS: 1 ChangeBit- 00:07:23.696 [2024-11-30 15:44:31.607786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:3dc6d7b3 cdw11:94000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.696 [2024-11-30 
15:44:31.607813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.696 #63 NEW cov: 12523 ft: 15426 corp: 29/334b lim: 35 exec/s: 63 rss: 73Mb L: 8/23 MS: 1 ChangeByte- 00:07:23.696 [2024-11-30 15:44:31.648179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:df000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.696 [2024-11-30 15:44:31.648205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.696 [2024-11-30 15:44:31.648262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:61610000 cdw11:61610002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.696 [2024-11-30 15:44:31.648276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.696 [2024-11-30 15:44:31.648333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:61616161 cdw11:61610002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.696 [2024-11-30 15:44:31.648347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.954 #64 NEW cov: 12523 ft: 15431 corp: 30/357b lim: 35 exec/s: 64 rss: 73Mb L: 23/23 MS: 1 ChangeBinInt- 00:07:23.954 [2024-11-30 15:44:31.707837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:57b35601 cdw11:c6940000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.954 [2024-11-30 15:44:31.707862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.954 #65 NEW cov: 12523 ft: 15442 corp: 31/366b lim: 35 exec/s: 65 rss: 73Mb L: 9/23 MS: 1 ChangeBit- 00:07:23.954 [2024-11-30 15:44:31.747845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c601d7b3 cdw11:00940000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.954 [2024-11-30 15:44:31.747870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.954 #66 NEW cov: 12523 ft: 15476 corp: 32/375b lim: 35 exec/s: 66 rss: 73Mb L: 9/23 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:23.954 [2024-11-30 15:44:31.788100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a2356 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.954 [2024-11-30 15:44:31.788126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.954 [2024-11-30 15:44:31.788201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a0a00ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.954 [2024-11-30 15:44:31.788216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.954 #67 NEW cov: 12523 ft: 15689 corp: 33/393b lim: 35 exec/s: 67 rss: 73Mb L: 18/23 MS: 1 InsertRepeatedBytes- 00:07:23.954 [2024-11-30 15:44:31.847933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01d7db56 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.954 [2024-11-30 
15:44:31.847958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.954 #68 NEW cov: 12523 ft: 15756 corp: 34/402b lim: 35 exec/s: 68 rss: 73Mb L: 9/23 MS: 1 EraseBytes- 00:07:23.954 [2024-11-30 15:44:31.887903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0000f8 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.954 [2024-11-30 15:44:31.887927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.954 #69 NEW cov: 12523 ft: 15782 corp: 35/411b lim: 35 exec/s: 69 rss: 73Mb L: 9/23 MS: 1 ChangeBinInt- 00:07:24.214 [2024-11-30 15:44:31.928253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00230002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.214 [2024-11-30 15:44:31.928278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.214 [2024-11-30 15:44:31.928339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b3c60a0a cdw11:00940000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.214 [2024-11-30 15:44:31.928352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.214 [2024-11-30 15:44:31.928410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0a00ff0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.214 [2024-11-30 15:44:31.928425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.214 #70 NEW cov: 12523 ft: 15788 corp: 36/438b lim: 35 exec/s: 70 rss: 73Mb L: 27/27 MS: 1 CrossOver- 00:07:24.214 [2024-11-30 15:44:31.968279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00d70101 cdw11:b3c60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.214 [2024-11-30 15:44:31.968305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.214 [2024-11-30 15:44:31.968381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:01256d00 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.214 [2024-11-30 15:44:31.968395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.214 [2024-11-30 15:44:31.968452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.214 [2024-11-30 15:44:31.968466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.214 #71 NEW cov: 12523 ft: 15797 corp: 37/461b lim: 35 exec/s: 35 rss: 73Mb L: 23/27 MS: 1 ShuffleBytes- 00:07:24.214 #71 DONE cov: 12523 ft: 15797 corp: 37/461b lim: 35 exec/s: 35 rss: 73Mb 00:07:24.214 ###### Recommended dictionary. ###### 00:07:24.214 "\333V\001\327\263\306\224\000" # Uses: 3 00:07:24.214 "\001\000" # Uses: 1 00:07:24.214 ###### End of recommended dictionary. 
###### 00:07:24.214 Done 71 runs in 2 second(s) 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:24.214 15:44:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:07:24.214 [2024-11-30 15:44:32.157869] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:24.214 [2024-11-30 15:44:32.157937] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1712655 ] 00:07:24.782 [2024-11-30 15:44:32.471836] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:24.782 [2024-11-30 15:44:32.518437] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.782 [2024-11-30 15:44:32.539468] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.782 [2024-11-30 15:44:32.591835] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:24.782 [2024-11-30 15:44:32.608145] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:24.782 INFO: Running with entropic power schedule (0xFF, 100). 00:07:24.782 INFO: Seed: 2238048913 00:07:24.782 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:24.782 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:24.782 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:24.782 INFO: A corpus is not provided, starting from an empty corpus 00:07:24.782 #2 INITED exec/s: 0 rss: 65Mb 00:07:24.782 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:24.782 This may also happen if the target rejected all inputs we tried so far 00:07:24.782 [2024-11-30 15:44:32.673744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:7e7e0a7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.783 [2024-11-30 15:44:32.673773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.783 [2024-11-30 15:44:32.673827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.783 [2024-11-30 15:44:32.673841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.783 [2024-11-30 15:44:32.673894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.783 [2024-11-30 15:44:32.673907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.041 NEW_FUNC[1/717]: 0x467038 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:25.041 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:25.041 #8 NEW cov: 12307 ft: 12308 corp: 2/29b lim: 45 exec/s: 0 rss: 72Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:07:25.041 [2024-11-30 15:44:32.994134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.041 [2024-11-30 15:44:32.994187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.041 [2024-11-30 15:44:32.994275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.041 [2024-11-30 15:44:32.994306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.041 [2024-11-30 15:44:32.994382] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.041 [2024-11-30 15:44:32.994406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.041 [2024-11-30 15:44:32.994482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.041 [2024-11-30 15:44:32.994506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.300 #24 NEW cov: 12420 ft: 13296 corp: 3/72b lim: 45 exec/s: 0 rss: 72Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:07:25.300 [2024-11-30 15:44:33.063875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.300 [2024-11-30 15:44:33.063902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.300 [2024-11-30 15:44:33.063957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.300 [2024-11-30 15:44:33.063971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.300 [2024-11-30 15:44:33.064025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.300 [2024-11-30 15:44:33.064038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.300 [2024-11-30 15:44:33.064093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e6b7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.300 [2024-11-30 15:44:33.064107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.300 #30 NEW cov: 12426 ft: 13516 corp: 4/115b lim: 45 exec/s: 0 rss: 72Mb L: 43/43 MS: 1 ChangeByte- 00:07:25.300 [2024-11-30 15:44:33.123894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.300 [2024-11-30 15:44:33.123923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.300 [2024-11-30 15:44:33.123993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.300 [2024-11-30 15:44:33.124008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.300 [2024-11-30 15:44:33.124061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.300 [2024-11-30 15:44:33.124075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.300 [2024-11-30 15:44:33.124128] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.300 [2024-11-30 15:44:33.124142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.300 #31 NEW cov: 12511 ft: 13781 corp: 5/158b lim: 45 exec/s: 0 rss: 72Mb L: 43/43 MS: 1 ChangeByte- 00:07:25.300 [2024-11-30 15:44:33.163929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.300 [2024-11-30 15:44:33.163958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.300 [2024-11-30 15:44:33.164013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.300 [2024-11-30 15:44:33.164027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.300 [2024-11-30 15:44:33.164082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.301 [2024-11-30 15:44:33.164096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.301 [2024-11-30 15:44:33.164148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.301 [2024-11-30 15:44:33.164160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.301 #32 NEW cov: 12511 ft: 13929 corp: 6/201b lim: 45 exec/s: 0 rss: 72Mb L: 43/43 MS: 1 ChangeBinInt- 00:07:25.301 [2024-11-30 15:44:33.223426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.301 [2024-11-30 15:44:33.223451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.301 #35 NEW cov: 12511 ft: 14778 corp: 7/213b lim: 45 exec/s: 0 rss: 72Mb L: 12/43 MS: 3 ChangeBit-ChangeBit-InsertRepeatedBytes- 00:07:25.301 [2024-11-30 15:44:33.263984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.301 [2024-11-30 15:44:33.264010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.301 [2024-11-30 15:44:33.264066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e260 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.301 [2024-11-30 15:44:33.264080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.301 [2024-11-30 15:44:33.264134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.301 [2024-11-30 15:44:33.264148] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.301 [2024-11-30 15:44:33.264201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.301 [2024-11-30 15:44:33.264214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.559 #36 NEW cov: 12511 ft: 14846 corp: 8/257b lim: 45 exec/s: 0 rss: 72Mb L: 44/44 MS: 1 InsertByte- 00:07:25.559 [2024-11-30 15:44:33.303992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.559 [2024-11-30 15:44:33.304018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.559 [2024-11-30 15:44:33.304090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.559 [2024-11-30 15:44:33.304104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.559 [2024-11-30 15:44:33.304159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.304176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.304229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.304242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.560 #37 NEW cov: 12511 ft: 14971 corp: 9/300b lim: 45 exec/s: 0 rss: 73Mb L: 43/44 MS: 1 ChangeByte- 00:07:25.560 [2024-11-30 15:44:33.363975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.364001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.364073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.364087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.364140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.364153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.364206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e0a7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.364220] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.560 #38 NEW cov: 12511 ft: 15071 corp: 10/344b lim: 45 exec/s: 0 rss: 73Mb L: 44/44 MS: 1 CrossOver- 00:07:25.560 [2024-11-30 15:44:33.404018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.404044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.404099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.404113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.404166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.404179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.404229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.404243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.560 #39 NEW cov: 12511 ft: 15116 corp: 11/387b lim: 45 exec/s: 0 rss: 73Mb L: 43/44 MS: 1 ShuffleBytes- 00:07:25.560 [2024-11-30 15:44:33.464269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.464295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.464350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.464367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.464421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.464434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.464488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.464501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.560 #40 NEW cov: 12511 ft: 15266 corp: 12/426b lim: 45 exec/s: 0 rss: 73Mb L: 39/44 MS: 1 CrossOver- 00:07:25.560 [2024-11-30 15:44:33.504028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 
cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.504054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.504110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.504124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.504179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e0a7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.504193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.560 [2024-11-30 15:44:33.504247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.560 [2024-11-30 15:44:33.504260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.819 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:25.819 #41 NEW cov: 12534 ft: 15284 corp: 13/469b lim: 45 exec/s: 0 rss: 73Mb L: 43/44 MS: 1 ShuffleBytes- 00:07:25.819 [2024-11-30 15:44:33.563931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.563956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.564028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e27e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.564041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.564096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.564110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.820 #42 NEW cov: 12534 ft: 15308 corp: 14/502b lim: 45 exec/s: 0 rss: 73Mb L: 33/44 MS: 1 EraseBytes- 00:07:25.820 [2024-11-30 15:44:33.604090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.604116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.604187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b5e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.604204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 
15:44:33.604259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.604273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.604326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.604339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.820 #48 NEW cov: 12534 ft: 15334 corp: 15/546b lim: 45 exec/s: 0 rss: 73Mb L: 44/44 MS: 1 InsertByte- 00:07:25.820 [2024-11-30 15:44:33.644147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.644172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.644228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.644242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.644294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.644307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.644360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.644373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.820 #49 NEW cov: 12534 ft: 15379 corp: 16/589b lim: 45 exec/s: 49 rss: 73Mb L: 43/44 MS: 1 ChangeBit- 00:07:25.820 [2024-11-30 15:44:33.683961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.683987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.684042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.684055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.684109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.684123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.820 #50 NEW cov: 
12534 ft: 15385 corp: 17/623b lim: 45 exec/s: 50 rss: 73Mb L: 34/44 MS: 1 EraseBytes- 00:07:25.820 [2024-11-30 15:44:33.724111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.724137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.724208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.724226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.724278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.724292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.820 [2024-11-30 15:44:33.724347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:6b7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.820 [2024-11-30 15:44:33.724360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:25.820 #51 NEW cov: 12534 ft: 15389 corp: 18/667b lim: 45 exec/s: 51 rss: 73Mb L: 44/44 MS: 1 CrossOver- 00:07:25.820 [2024-11-30 15:44:33.784188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.784214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:33.784272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2eee2e2 cdw11:eeee0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.784286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:33.784338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e27eeee2 cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.784352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:33.784405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.784419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.080 #52 NEW cov: 12534 ft: 15412 corp: 19/707b lim: 45 exec/s: 52 rss: 73Mb L: 40/44 MS: 1 InsertRepeatedBytes- 00:07:26.080 [2024-11-30 15:44:33.844226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.844253] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:33.844309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.844323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:33.844377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.844391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:33.844446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.844460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.080 #53 NEW cov: 12534 ft: 15420 corp: 20/750b lim: 45 exec/s: 53 rss: 73Mb L: 43/44 MS: 1 ChangeBit- 00:07:26.080 [2024-11-30 15:44:33.883867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.883896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:33.883951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b5e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.883965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.080 #54 NEW cov: 12534 ft: 15729 corp: 21/776b lim: 45 exec/s: 54 rss: 73Mb L: 26/44 MS: 1 CrossOver- 00:07:26.080 [2024-11-30 15:44:33.944081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.944107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:33.944178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b5e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.944193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:33.944248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e27ee2e2 cdw11:e2e20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:33.944261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.080 #55 NEW cov: 12534 ft: 15748 corp: 22/803b lim: 45 exec/s: 55 rss: 73Mb L: 27/44 MS: 1 InsertByte- 00:07:26.080 [2024-11-30 15:44:34.004271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 
cdw10:e2e21eea cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:34.004296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:34.004351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:34.004365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:34.004417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:34.004430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:34.004482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:34.004494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.080 #56 NEW cov: 12534 ft: 15765 corp: 23/846b lim: 45 exec/s: 56 rss: 73Mb L: 43/44 MS: 1 ChangeBit- 00:07:26.080 [2024-11-30 15:44:34.044339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:34.044365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.080 [2024-11-30 15:44:34.044420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.080 [2024-11-30 15:44:34.044434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.340 [2024-11-30 15:44:34.044490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.044506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.340 [2024-11-30 15:44:34.044559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.044572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.340 #57 NEW cov: 12534 ft: 15811 corp: 24/889b lim: 45 exec/s: 57 rss: 73Mb L: 43/44 MS: 1 ChangeByte- 00:07:26.340 [2024-11-30 15:44:34.084111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.084136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.340 [2024-11-30 15:44:34.084206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 
cdw10:e2e2e21e cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.084221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.340 [2024-11-30 15:44:34.084275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.084288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.340 #58 NEW cov: 12534 ft: 15871 corp: 25/924b lim: 45 exec/s: 58 rss: 73Mb L: 35/44 MS: 1 CrossOver- 00:07:26.340 [2024-11-30 15:44:34.144303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.144329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.340 [2024-11-30 15:44:34.144384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b5e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.144398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.340 [2024-11-30 15:44:34.144453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e2e20ae2 cdw11:e27e0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.144467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.340 [2024-11-30 15:44:34.144521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e9e9e9e9 cdw11:e2e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.144535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.340 #59 NEW cov: 12534 ft: 15899 corp: 26/962b lim: 45 exec/s: 59 rss: 73Mb L: 38/44 MS: 1 CrossOver- 00:07:26.340 [2024-11-30 15:44:34.184366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.184392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.340 [2024-11-30 15:44:34.184448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.184462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.340 [2024-11-30 15:44:34.184516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.184533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.340 [2024-11-30 15:44:34.184587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 
cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.184605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.340 #60 NEW cov: 12534 ft: 15915 corp: 27/1005b lim: 45 exec/s: 60 rss: 73Mb L: 43/44 MS: 1 ChangeByte- 00:07:26.340 [2024-11-30 15:44:34.224358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.340 [2024-11-30 15:44:34.224384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.340 [2024-11-30 15:44:34.224438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b5e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.341 [2024-11-30 15:44:34.224452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.341 [2024-11-30 15:44:34.224508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e2e20ae2 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.341 [2024-11-30 15:44:34.224521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.341 [2024-11-30 15:44:34.224574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e9e9e9e9 cdw11:e9e90007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.341 [2024-11-30 15:44:34.224587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.341 #61 NEW cov: 12534 ft: 15925 corp: 28/1043b lim: 45 exec/s: 61 rss: 74Mb L: 38/44 MS: 1 CrossOver- 00:07:26.341 [2024-11-30 15:44:34.284564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:7e7e0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.341 [2024-11-30 15:44:34.284589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.341 [2024-11-30 15:44:34.284664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.341 [2024-11-30 15:44:34.284689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.341 [2024-11-30 15:44:34.284743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.341 [2024-11-30 15:44:34.284756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.341 [2024-11-30 15:44:34.284808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.341 [2024-11-30 15:44:34.284821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.341 [2024-11-30 15:44:34.284875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 
cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.341 [2024-11-30 15:44:34.284888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.600 #67 NEW cov: 12534 ft: 15979 corp: 29/1088b lim: 45 exec/s: 67 rss: 74Mb L: 45/45 MS: 1 CopyPart- 00:07:26.600 [2024-11-30 15:44:34.324421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.600 [2024-11-30 15:44:34.324450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.600 [2024-11-30 15:44:34.324505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.600 [2024-11-30 15:44:34.324518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.600 [2024-11-30 15:44:34.324569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.600 [2024-11-30 15:44:34.324582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.600 [2024-11-30 15:44:34.324638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.600 [2024-11-30 15:44:34.324652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.600 #68 NEW cov: 12534 ft: 15992 corp: 30/1132b lim: 45 exec/s: 68 rss: 74Mb L: 44/45 MS: 1 InsertByte- 00:07:26.600 [2024-11-30 15:44:34.384460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.600 [2024-11-30 15:44:34.384487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.600 [2024-11-30 15:44:34.384540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.600 [2024-11-30 15:44:34.384554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.601 [2024-11-30 15:44:34.384613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.384627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.601 [2024-11-30 15:44:34.384680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.384693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.601 #69 NEW cov: 12534 ft: 15995 corp: 31/1172b lim: 45 exec/s: 69 rss: 74Mb L: 40/45 MS: 1 InsertByte- 00:07:26.601 [2024-11-30 
15:44:34.444453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.444478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.601 [2024-11-30 15:44:34.444551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.444565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.601 [2024-11-30 15:44:34.444622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.444636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.601 [2024-11-30 15:44:34.444691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.444704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.601 #70 NEW cov: 12534 ft: 15996 corp: 32/1216b lim: 45 exec/s: 70 rss: 74Mb L: 44/45 MS: 1 ShuffleBytes- 00:07:26.601 [2024-11-30 15:44:34.504499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.504524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.601 [2024-11-30 15:44:34.504610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.504624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.601 [2024-11-30 15:44:34.504695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.504707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.601 [2024-11-30 15:44:34.504761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.504774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.601 #71 NEW cov: 12534 ft: 16017 corp: 33/1255b lim: 45 exec/s: 71 rss: 74Mb L: 39/45 MS: 1 ChangeBinInt- 00:07:26.601 [2024-11-30 15:44:34.544603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.544630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.601 
[2024-11-30 15:44:34.544703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e27ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.544717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.601 [2024-11-30 15:44:34.544771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.544784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.601 [2024-11-30 15:44:34.544841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e7e cdw11:6b7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.601 [2024-11-30 15:44:34.544855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.861 #72 NEW cov: 12534 ft: 16048 corp: 34/1299b lim: 45 exec/s: 72 rss: 74Mb L: 44/45 MS: 1 CrossOver- 00:07:26.861 [2024-11-30 15:44:34.584592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.861 [2024-11-30 15:44:34.584623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.861 [2024-11-30 15:44:34.584678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.861 [2024-11-30 15:44:34.584691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.861 [2024-11-30 15:44:34.584744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:7e7e7e7e cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.861 [2024-11-30 15:44:34.584757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.861 [2024-11-30 15:44:34.584814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7e7e7e3b cdw11:7e7e0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.861 [2024-11-30 15:44:34.584827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.861 #73 NEW cov: 12534 ft: 16059 corp: 35/1342b lim: 45 exec/s: 73 rss: 74Mb L: 43/45 MS: 1 ChangeByte- 00:07:26.861 [2024-11-30 15:44:34.624475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e21ee2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.861 [2024-11-30 15:44:34.624500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.861 [2024-11-30 15:44:34.624555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e21e cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.861 [2024-11-30 15:44:34.624569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.861 
[2024-11-30 15:44:34.624626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.861 [2024-11-30 15:44:34.624640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.861 #74 NEW cov: 12534 ft: 16080 corp: 36/1377b lim: 45 exec/s: 37 rss: 74Mb L: 35/45 MS: 1 ShuffleBytes- 00:07:26.861 #74 DONE cov: 12534 ft: 16080 corp: 36/1377b lim: 45 exec/s: 37 rss: 74Mb 00:07:26.861 Done 74 runs in 2 second(s) 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:26.861 15:44:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:07:26.862 [2024-11-30 15:44:34.811156] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:07:26.862 [2024-11-30 15:44:34.811227] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1713191 ] 00:07:27.429 [2024-11-30 15:44:35.127193] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:27.429 [2024-11-30 15:44:35.173216] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.429 [2024-11-30 15:44:35.192400] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.429 [2024-11-30 15:44:35.244827] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:27.429 [2024-11-30 15:44:35.261134] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:27.429 INFO: Running with entropic power schedule (0xFF, 100). 00:07:27.429 INFO: Seed: 595068992 00:07:27.429 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:27.429 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:27.429 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:27.429 INFO: A corpus is not provided, starting from an empty corpus 00:07:27.429 #2 INITED exec/s: 0 rss: 64Mb 00:07:27.429 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:27.429 This may also happen if the target rejected all inputs we tried so far 00:07:27.429 [2024-11-30 15:44:35.310083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.429 [2024-11-30 15:44:35.310110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.688 NEW_FUNC[1/715]: 0x469848 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:27.688 NEW_FUNC[2/715]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:27.688 #3 NEW cov: 12224 ft: 12215 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 CrossOver- 00:07:27.688 [2024-11-30 15:44:35.630472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:27.688 [2024-11-30 15:44:35.630504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.688 [2024-11-30 15:44:35.630562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:27.688 [2024-11-30 15:44:35.630576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.688 [2024-11-30 15:44:35.630635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.688 [2024-11-30 15:44:35.630650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.948 #4 NEW cov: 12337 ft: 12997 corp: 3/9b lim: 10 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 InsertRepeatedBytes- 
00:07:27.948 [2024-11-30 15:44:35.690145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0a cdw11:00000000 00:07:27.948 [2024-11-30 15:44:35.690172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.948 #6 NEW cov: 12343 ft: 13303 corp: 4/11b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 2 ChangeBit-CrossOver- 00:07:27.948 [2024-11-30 15:44:35.730423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:27.948 [2024-11-30 15:44:35.730451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.948 [2024-11-30 15:44:35.730511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff26 cdw11:00000000 00:07:27.948 [2024-11-30 15:44:35.730525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.948 [2024-11-30 15:44:35.730582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:27.948 [2024-11-30 15:44:35.730596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.948 #7 NEW cov: 12428 ft: 13474 corp: 5/18b lim: 10 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 InsertByte- 00:07:27.948 [2024-11-30 15:44:35.790586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:27.948 [2024-11-30 15:44:35.790618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.948 [2024-11-30 15:44:35.790690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:27.948 [2024-11-30 15:44:35.790714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.948 [2024-11-30 15:44:35.790769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:27.948 [2024-11-30 15:44:35.790782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.948 [2024-11-30 15:44:35.790835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:27.948 [2024-11-30 15:44:35.790848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.948 #8 NEW cov: 12428 ft: 13930 corp: 6/27b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CopyPart- 00:07:27.948 [2024-11-30 15:44:35.830472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:27.949 [2024-11-30 15:44:35.830497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.949 [2024-11-30 15:44:35.830552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:07:27.949 [2024-11-30 15:44:35.830566] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.949 [2024-11-30 15:44:35.830626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:27.949 [2024-11-30 15:44:35.830640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.949 #9 NEW cov: 12428 ft: 14097 corp: 7/33b lim: 10 exec/s: 0 rss: 72Mb L: 6/9 MS: 1 ShuffleBytes- 00:07:27.949 [2024-11-30 15:44:35.870589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff7b cdw11:00000000 00:07:27.949 [2024-11-30 15:44:35.870619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.949 [2024-11-30 15:44:35.870691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:27.949 [2024-11-30 15:44:35.870706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.949 [2024-11-30 15:44:35.870761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000026ff cdw11:00000000 00:07:27.949 [2024-11-30 15:44:35.870774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.949 [2024-11-30 15:44:35.870827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:27.949 [2024-11-30 15:44:35.870844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.949 #10 NEW cov: 12428 ft: 14170 corp: 8/41b lim: 10 exec/s: 0 rss: 72Mb L: 8/9 MS: 1 InsertByte- 00:07:28.209 [2024-11-30 15:44:35.930395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.209 [2024-11-30 15:44:35.930420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.209 [2024-11-30 15:44:35.930478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:28.209 [2024-11-30 15:44:35.930491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.209 #11 NEW cov: 12428 ft: 14345 corp: 9/46b lim: 10 exec/s: 0 rss: 72Mb L: 5/9 MS: 1 EraseBytes- 00:07:28.209 [2024-11-30 15:44:35.970545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.209 [2024-11-30 15:44:35.970572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.209 [2024-11-30 15:44:35.970647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff26 cdw11:00000000 00:07:28.209 [2024-11-30 15:44:35.970662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.209 [2024-11-30 15:44:35.970717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 
cid:6 nsid:0 cdw10:00007f0a cdw11:00000000 00:07:28.209 [2024-11-30 15:44:35.970733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.209 #12 NEW cov: 12428 ft: 14428 corp: 10/53b lim: 10 exec/s: 0 rss: 72Mb L: 7/9 MS: 1 ChangeBit- 00:07:28.209 [2024-11-30 15:44:36.010298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0a cdw11:00000000 00:07:28.209 [2024-11-30 15:44:36.010324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.209 #13 NEW cov: 12428 ft: 14537 corp: 11/55b lim: 10 exec/s: 0 rss: 72Mb L: 2/9 MS: 1 ShuffleBytes- 00:07:28.209 [2024-11-30 15:44:36.070721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.209 [2024-11-30 15:44:36.070747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.209 [2024-11-30 15:44:36.070801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.209 [2024-11-30 15:44:36.070815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.209 [2024-11-30 15:44:36.070870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.209 [2024-11-30 15:44:36.070882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.209 [2024-11-30 15:44:36.070937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:28.209 [2024-11-30 15:44:36.070950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.209 #14 NEW cov: 12428 ft: 14576 corp: 12/64b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ShuffleBytes- 00:07:28.209 [2024-11-30 15:44:36.130437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:28.209 [2024-11-30 15:44:36.130462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.209 [2024-11-30 15:44:36.130518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.209 [2024-11-30 15:44:36.130535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.209 #16 NEW cov: 12428 ft: 14600 corp: 13/69b lim: 10 exec/s: 0 rss: 72Mb L: 5/9 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:28.209 [2024-11-30 15:44:36.170581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.209 [2024-11-30 15:44:36.170612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.209 [2024-11-30 15:44:36.170666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.209 [2024-11-30 15:44:36.170681] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.209 [2024-11-30 15:44:36.170736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.209 [2024-11-30 15:44:36.170749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.469 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:28.469 #17 NEW cov: 12451 ft: 14646 corp: 14/75b lim: 10 exec/s: 0 rss: 73Mb L: 6/9 MS: 1 CrossOver- 00:07:28.469 [2024-11-30 15:44:36.230683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002d2d cdw11:00000000 00:07:28.469 [2024-11-30 15:44:36.230709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.469 [2024-11-30 15:44:36.230791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002d2d cdw11:00000000 00:07:28.469 [2024-11-30 15:44:36.230805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.469 [2024-11-30 15:44:36.230860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002d0a cdw11:00000000 00:07:28.469 [2024-11-30 15:44:36.230874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.469 #18 NEW cov: 12451 ft: 14733 corp: 15/82b lim: 10 exec/s: 0 rss: 73Mb L: 7/9 MS: 1 InsertRepeatedBytes- 00:07:28.469 [2024-11-30 15:44:36.270629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.469 [2024-11-30 15:44:36.270654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.469 [2024-11-30 15:44:36.270710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff26 cdw11:00000000 00:07:28.469 [2024-11-30 15:44:36.270723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.469 [2024-11-30 15:44:36.270778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:28.469 [2024-11-30 15:44:36.270791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.469 #19 NEW cov: 12451 ft: 14746 corp: 16/89b lim: 10 exec/s: 0 rss: 73Mb L: 7/9 MS: 1 ChangeBit- 00:07:28.469 [2024-11-30 15:44:36.310407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002501 cdw11:00000000 00:07:28.469 [2024-11-30 15:44:36.310432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.469 #21 NEW cov: 12451 ft: 14858 corp: 17/91b lim: 10 exec/s: 21 rss: 73Mb L: 2/9 MS: 2 ChangeBinInt-InsertByte- 00:07:28.469 [2024-11-30 15:44:36.350811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.469 [2024-11-30 
15:44:36.350840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.469 [2024-11-30 15:44:36.350914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff26 cdw11:00000000 00:07:28.469 [2024-11-30 15:44:36.350928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.469 [2024-11-30 15:44:36.350983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007f0a cdw11:00000000 00:07:28.469 [2024-11-30 15:44:36.350996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.469 [2024-11-30 15:44:36.351050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002501 cdw11:00000000 00:07:28.469 [2024-11-30 15:44:36.351063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.469 #22 NEW cov: 12451 ft: 14891 corp: 18/100b lim: 10 exec/s: 22 rss: 73Mb L: 9/9 MS: 1 CrossOver- 00:07:28.469 [2024-11-30 15:44:36.410470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002501 cdw11:00000000 00:07:28.469 [2024-11-30 15:44:36.410495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.728 #23 NEW cov: 12451 ft: 14962 corp: 19/103b lim: 10 exec/s: 23 rss: 73Mb L: 3/9 MS: 1 InsertByte- 00:07:28.728 [2024-11-30 15:44:36.470617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.470643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.729 [2024-11-30 15:44:36.470699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000fffe cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.470713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.729 #24 NEW cov: 12451 ft: 14976 corp: 20/108b lim: 10 exec/s: 24 rss: 73Mb L: 5/9 MS: 1 ChangeBit- 00:07:28.729 [2024-11-30 15:44:36.530912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff7b cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.530938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.729 [2024-11-30 15:44:36.531009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.531023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.729 [2024-11-30 15:44:36.531077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00009aff cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.531091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.729 [2024-11-30 15:44:36.531144] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.531157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.729 #25 NEW cov: 12451 ft: 14981 corp: 21/116b lim: 10 exec/s: 25 rss: 73Mb L: 8/9 MS: 1 ChangeByte- 00:07:28.729 [2024-11-30 15:44:36.590681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.590706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.729 [2024-11-30 15:44:36.590775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.590792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.729 #26 NEW cov: 12451 ft: 15044 corp: 22/121b lim: 10 exec/s: 26 rss: 73Mb L: 5/9 MS: 1 ShuffleBytes- 00:07:28.729 [2024-11-30 15:44:36.630798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.630823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.729 [2024-11-30 15:44:36.630894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.630909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.729 [2024-11-30 15:44:36.630965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.630979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.729 #27 NEW cov: 12451 ft: 15082 corp: 23/127b lim: 10 exec/s: 27 rss: 73Mb L: 6/9 MS: 1 CopyPart- 00:07:28.729 [2024-11-30 15:44:36.690879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.690904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.729 [2024-11-30 15:44:36.690962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.690976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.729 [2024-11-30 15:44:36.691031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.729 [2024-11-30 15:44:36.691045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.989 #28 NEW cov: 12451 ft: 15086 corp: 24/133b lim: 10 exec/s: 28 rss: 73Mb L: 6/9 MS: 1 ShuffleBytes- 00:07:28.989 [2024-11-30 15:44:36.730619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 
00:07:28.989 [2024-11-30 15:44:36.730644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.989 #29 NEW cov: 12451 ft: 15119 corp: 25/135b lim: 10 exec/s: 29 rss: 73Mb L: 2/9 MS: 1 CopyPart- 00:07:28.989 [2024-11-30 15:44:36.770593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002501 cdw11:00000000 00:07:28.989 [2024-11-30 15:44:36.770622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.989 #30 NEW cov: 12451 ft: 15150 corp: 26/137b lim: 10 exec/s: 30 rss: 73Mb L: 2/9 MS: 1 CrossOver- 00:07:28.989 [2024-11-30 15:44:36.810750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.989 [2024-11-30 15:44:36.810776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.989 [2024-11-30 15:44:36.810848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:28.989 [2024-11-30 15:44:36.810863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.989 #31 NEW cov: 12451 ft: 15166 corp: 27/141b lim: 10 exec/s: 31 rss: 73Mb L: 4/9 MS: 1 EraseBytes- 00:07:28.989 [2024-11-30 15:44:36.850930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:28.989 [2024-11-30 15:44:36.850955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.989 [2024-11-30 15:44:36.851030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003fff cdw11:00000000 00:07:28.989 [2024-11-30 15:44:36.851045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.989 [2024-11-30 15:44:36.851101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.989 [2024-11-30 15:44:36.851114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.989 #32 NEW cov: 12451 ft: 15183 corp: 28/148b lim: 10 exec/s: 32 rss: 73Mb L: 7/9 MS: 1 InsertByte- 00:07:28.989 [2024-11-30 15:44:36.911209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:28.989 [2024-11-30 15:44:36.911234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.989 [2024-11-30 15:44:36.911307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.989 [2024-11-30 15:44:36.911321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.989 [2024-11-30 15:44:36.911379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.990 [2024-11-30 15:44:36.911392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.990 [2024-11-30 15:44:36.911447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.990 [2024-11-30 15:44:36.911461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.990 [2024-11-30 15:44:36.911516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.990 [2024-11-30 15:44:36.911529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:28.990 #33 NEW cov: 12451 ft: 15252 corp: 29/158b lim: 10 exec/s: 33 rss: 73Mb L: 10/10 MS: 1 CopyPart- 00:07:28.990 [2024-11-30 15:44:36.950979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:28.990 [2024-11-30 15:44:36.951004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.990 [2024-11-30 15:44:36.951062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:28.990 [2024-11-30 15:44:36.951076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.990 [2024-11-30 15:44:36.951132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000aff cdw11:00000000 00:07:28.990 [2024-11-30 15:44:36.951145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.250 #34 NEW cov: 12451 ft: 15280 corp: 30/164b lim: 10 exec/s: 34 rss: 73Mb L: 6/10 MS: 1 ShuffleBytes- 00:07:29.250 [2024-11-30 15:44:36.991123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000024ff cdw11:00000000 00:07:29.250 [2024-11-30 15:44:36.991148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.250 [2024-11-30 15:44:36.991221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:29.250 [2024-11-30 15:44:36.991235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.250 [2024-11-30 15:44:36.991296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000267f cdw11:00000000 00:07:29.250 [2024-11-30 15:44:36.991310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.250 [2024-11-30 15:44:36.991366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:29.250 [2024-11-30 15:44:36.991379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.250 #35 NEW cov: 12451 ft: 15303 corp: 31/172b lim: 10 exec/s: 35 rss: 73Mb L: 8/10 MS: 1 InsertByte- 00:07:29.250 [2024-11-30 15:44:37.030754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 
00:07:29.250 [2024-11-30 15:44:37.030779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.250 #36 NEW cov: 12451 ft: 15354 corp: 32/174b lim: 10 exec/s: 36 rss: 73Mb L: 2/10 MS: 1 ChangeBit- 00:07:29.250 [2024-11-30 15:44:37.071093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:29.250 [2024-11-30 15:44:37.071117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.250 [2024-11-30 15:44:37.071191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff26 cdw11:00000000 00:07:29.250 [2024-11-30 15:44:37.071205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.250 [2024-11-30 15:44:37.071261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007f0a cdw11:00000000 00:07:29.250 [2024-11-30 15:44:37.071275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.250 #37 NEW cov: 12451 ft: 15361 corp: 33/181b lim: 10 exec/s: 37 rss: 73Mb L: 7/10 MS: 1 ChangeByte- 00:07:29.250 [2024-11-30 15:44:37.111108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:29.250 [2024-11-30 15:44:37.111132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.250 [2024-11-30 15:44:37.111202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff26 cdw11:00000000 00:07:29.250 [2024-11-30 15:44:37.111216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.250 [2024-11-30 15:44:37.111270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007f0a cdw11:00000000 00:07:29.250 [2024-11-30 15:44:37.111284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.250 #38 NEW cov: 12451 ft: 15368 corp: 34/188b lim: 10 exec/s: 38 rss: 73Mb L: 7/10 MS: 1 EraseBytes- 00:07:29.250 [2024-11-30 15:44:37.171083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:29.250 [2024-11-30 15:44:37.171108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.250 [2024-11-30 15:44:37.171179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:07:29.250 [2024-11-30 15:44:37.171194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.250 [2024-11-30 15:44:37.171248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:29.250 [2024-11-30 15:44:37.171261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.250 #39 NEW cov: 12451 ft: 15380 corp: 35/194b lim: 
10 exec/s: 39 rss: 73Mb L: 6/10 MS: 1 ShuffleBytes- 00:07:29.250 [2024-11-30 15:44:37.211139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:29.250 [2024-11-30 15:44:37.211165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.250 [2024-11-30 15:44:37.211220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff1e cdw11:00000000 00:07:29.250 [2024-11-30 15:44:37.211234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.250 [2024-11-30 15:44:37.211287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007f0a cdw11:00000000 00:07:29.251 [2024-11-30 15:44:37.211300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.511 #40 NEW cov: 12451 ft: 15389 corp: 36/201b lim: 10 exec/s: 40 rss: 74Mb L: 7/10 MS: 1 ChangeByte- 00:07:29.511 [2024-11-30 15:44:37.271280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:29.511 [2024-11-30 15:44:37.271306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.511 [2024-11-30 15:44:37.271358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:29.511 [2024-11-30 15:44:37.271372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.511 [2024-11-30 15:44:37.271424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:29.511 [2024-11-30 15:44:37.271437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.511 [2024-11-30 15:44:37.271491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:29.511 [2024-11-30 15:44:37.271504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.511 #41 NEW cov: 12451 ft: 15446 corp: 37/210b lim: 10 exec/s: 20 rss: 74Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:29.511 #41 DONE cov: 12451 ft: 15446 corp: 37/210b lim: 10 exec/s: 20 rss: 74Mb 00:07:29.511 Done 41 runs in 2 second(s) 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 
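The traced nvmf/run.sh lines immediately above and below show how each fuzzer instance derives its own TCP port, JSON config, and corpus directory from its index. A minimal sketch of that per-instance setup, reconstructed from the trace (the $rootdir variable and exact layout are assumptions, not SPDK's actual script):
  # Sketch reconstructed from the run.sh trace above/below; variable
  # names and the $rootdir prefix are illustrative assumptions.
  fuzzer_type=7
  port="44$(printf %02d "$fuzzer_type")"        # printf %02d 7 -> "07", so port=4407
  nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
  corpus_dir="$rootdir/../corpus/llvm_nvmf_${fuzzer_type}"
  mkdir -p "$corpus_dir"
  # Rewrite the default listener port 4420 to this instance's port
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
This keeps concurrent fuzzer types from colliding on the same NVMe/TCP listener; the llvm_nvme_fuzz binary is then launched with -F "$trid", -c "$nvmf_cfg", and -D "$corpus_dir", as the trace below shows.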
00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:29.511 15:44:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:07:29.511 [2024-11-30 15:44:37.455701] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:29.511 [2024-11-30 15:44:37.455793] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1713566 ] 00:07:30.081 [2024-11-30 15:44:37.769811] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:30.081 [2024-11-30 15:44:37.816264] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.082 [2024-11-30 15:44:37.839909] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.082 [2024-11-30 15:44:37.892211] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:30.082 [2024-11-30 15:44:37.908528] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:30.082 INFO: Running with entropic power schedule (0xFF, 100). 00:07:30.082 INFO: Seed: 3243297219 00:07:30.082 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:30.082 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:30.082 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:30.082 INFO: A corpus is not provided, starting from an empty corpus 00:07:30.082 #2 INITED exec/s: 0 rss: 64Mb 00:07:30.082 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:30.082 This may also happen if the target rejected all inputs we tried so far 00:07:30.082 [2024-11-30 15:44:37.953767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:30.082 [2024-11-30 15:44:37.953795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.342 NEW_FUNC[1/715]: 0x46a248 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:30.342 NEW_FUNC[2/715]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:30.342 #3 NEW cov: 12216 ft: 12214 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 CrossOver- 00:07:30.342 [2024-11-30 15:44:38.273841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:30.342 [2024-11-30 15:44:38.273872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.342 #5 NEW cov: 12337 ft: 12864 corp: 3/5b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 2 ShuffleBytes-CrossOver- 00:07:30.601 [2024-11-30 15:44:38.313742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:30.601 [2024-11-30 15:44:38.313769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.601 #6 NEW cov: 12343 ft: 13158 corp: 4/7b lim: 10 exec/s: 0 rss: 73Mb L: 2/2 MS: 1 ShuffleBytes- 00:07:30.601 [2024-11-30 15:44:38.373786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0b cdw11:00000000 00:07:30.601 [2024-11-30 15:44:38.373815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.601 #7 NEW cov: 12428 ft: 13439 corp: 5/9b lim: 10 exec/s: 0 rss: 73Mb L: 2/2 MS: 1 ChangeBit- 00:07:30.601 [2024-11-30 15:44:38.414051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001717 cdw11:00000000 00:07:30.601 [2024-11-30 15:44:38.414076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.601 [2024-11-30 15:44:38.414146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001717 cdw11:00000000 00:07:30.601 [2024-11-30 15:44:38.414160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.601 [2024-11-30 15:44:38.414216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000170a cdw11:00000000 00:07:30.601 [2024-11-30 15:44:38.414229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.601 #8 NEW cov: 12428 ft: 13782 corp: 6/15b lim: 10 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:30.601 [2024-11-30 15:44:38.453843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a8a cdw11:00000000 00:07:30.602 [2024-11-30 15:44:38.453869] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.602 #9 NEW cov: 12428 ft: 13857 corp: 7/17b lim: 10 exec/s: 0 rss: 73Mb L: 2/6 MS: 1 ChangeBit- 00:07:30.602 [2024-11-30 15:44:38.513873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a2c cdw11:00000000 00:07:30.602 [2024-11-30 15:44:38.513900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.602 #10 NEW cov: 12428 ft: 13976 corp: 8/19b lim: 10 exec/s: 0 rss: 73Mb L: 2/6 MS: 1 ChangeByte- 00:07:30.602 [2024-11-30 15:44:38.553861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000808 cdw11:00000000 00:07:30.602 [2024-11-30 15:44:38.553887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.861 #13 NEW cov: 12428 ft: 14000 corp: 9/21b lim: 10 exec/s: 0 rss: 73Mb L: 2/6 MS: 3 EraseBytes-ChangeBit-CopyPart- 00:07:30.861 [2024-11-30 15:44:38.613860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:30.861 [2024-11-30 15:44:38.613886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.861 #16 NEW cov: 12428 ft: 14075 corp: 10/23b lim: 10 exec/s: 0 rss: 73Mb L: 2/6 MS: 3 EraseBytes-CrossOver-CopyPart- 00:07:30.861 [2024-11-30 15:44:38.673860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a03 cdw11:00000000 00:07:30.861 [2024-11-30 15:44:38.673886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.861 #17 NEW cov: 12428 ft: 14120 corp: 11/25b lim: 10 exec/s: 0 rss: 73Mb L: 2/6 MS: 1 ChangeBit- 00:07:30.861 [2024-11-30 15:44:38.733911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:30.861 [2024-11-30 15:44:38.733936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.861 #18 NEW cov: 12428 ft: 14146 corp: 12/27b lim: 10 exec/s: 0 rss: 73Mb L: 2/6 MS: 1 CrossOver- 00:07:30.861 [2024-11-30 15:44:38.773995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:30.861 [2024-11-30 15:44:38.774021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.861 #19 NEW cov: 12428 ft: 14197 corp: 13/30b lim: 10 exec/s: 0 rss: 73Mb L: 3/6 MS: 1 CrossOver- 00:07:30.861 [2024-11-30 15:44:38.814215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000170a cdw11:00000000 00:07:30.861 [2024-11-30 15:44:38.814241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.861 [2024-11-30 15:44:38.814296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002c17 cdw11:00000000 00:07:30.861 [2024-11-30 15:44:38.814311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:30.861 [2024-11-30 15:44:38.814367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000170a cdw11:00000000 00:07:30.861 [2024-11-30 15:44:38.814381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.122 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:31.122 #20 NEW cov: 12451 ft: 14239 corp: 14/36b lim: 10 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 CrossOver- 00:07:31.122 [2024-11-30 15:44:38.873996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002f0a cdw11:00000000 00:07:31.122 [2024-11-30 15:44:38.874021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.122 #21 NEW cov: 12451 ft: 14246 corp: 15/39b lim: 10 exec/s: 0 rss: 73Mb L: 3/6 MS: 1 InsertByte- 00:07:31.122 [2024-11-30 15:44:38.914377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e2ff cdw11:00000000 00:07:31.122 [2024-11-30 15:44:38.914402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.122 [2024-11-30 15:44:38.914473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:31.122 [2024-11-30 15:44:38.914487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.122 [2024-11-30 15:44:38.914543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:31.122 [2024-11-30 15:44:38.914556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.122 [2024-11-30 15:44:38.914611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:31.122 [2024-11-30 15:44:38.914625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.122 #25 NEW cov: 12451 ft: 14563 corp: 16/48b lim: 10 exec/s: 0 rss: 73Mb L: 9/9 MS: 4 EraseBytes-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:07:31.122 [2024-11-30 15:44:38.954027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000190a cdw11:00000000 00:07:31.122 [2024-11-30 15:44:38.954052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.122 #26 NEW cov: 12451 ft: 14686 corp: 17/51b lim: 10 exec/s: 26 rss: 73Mb L: 3/9 MS: 1 InsertByte- 00:07:31.122 [2024-11-30 15:44:38.994210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.122 [2024-11-30 15:44:38.994235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.122 [2024-11-30 15:44:38.994290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a2c cdw11:00000000 00:07:31.122 [2024-11-30 15:44:38.994304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.122 #27 NEW cov: 12451 ft: 14836 corp: 18/55b lim: 10 exec/s: 27 rss: 73Mb L: 4/9 MS: 1 CrossOver- 00:07:31.122 [2024-11-30 15:44:39.054139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.122 [2024-11-30 15:44:39.054164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.122 #28 NEW cov: 12451 ft: 14871 corp: 19/58b lim: 10 exec/s: 28 rss: 73Mb L: 3/9 MS: 1 ChangeBinInt- 00:07:31.382 [2024-11-30 15:44:39.094280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0b cdw11:00000000 00:07:31.382 [2024-11-30 15:44:39.094307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.382 [2024-11-30 15:44:39.094364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000808 cdw11:00000000 00:07:31.382 [2024-11-30 15:44:39.094379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.382 #29 NEW cov: 12451 ft: 14897 corp: 20/62b lim: 10 exec/s: 29 rss: 73Mb L: 4/9 MS: 1 CrossOver- 00:07:31.382 [2024-11-30 15:44:39.134160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000410a cdw11:00000000 00:07:31.382 [2024-11-30 15:44:39.134185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.382 #30 NEW cov: 12451 ft: 14922 corp: 21/65b lim: 10 exec/s: 30 rss: 74Mb L: 3/9 MS: 1 InsertByte- 00:07:31.382 [2024-11-30 15:44:39.194349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a2a cdw11:00000000 00:07:31.382 [2024-11-30 15:44:39.194375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.382 [2024-11-30 15:44:39.194446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a2c cdw11:00000000 00:07:31.382 [2024-11-30 15:44:39.194461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.382 #31 NEW cov: 12451 ft: 14934 corp: 22/69b lim: 10 exec/s: 31 rss: 74Mb L: 4/9 MS: 1 InsertByte- 00:07:31.382 [2024-11-30 15:44:39.234201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001927 cdw11:00000000 00:07:31.382 [2024-11-30 15:44:39.234226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.382 #32 NEW cov: 12451 ft: 14980 corp: 23/72b lim: 10 exec/s: 32 rss: 74Mb L: 3/9 MS: 1 ChangeByte- 00:07:31.382 [2024-11-30 15:44:39.294247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000170a cdw11:00000000 00:07:31.382 [2024-11-30 15:44:39.294272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.382 #33 NEW cov: 12451 ft: 14991 corp: 24/75b lim: 10 exec/s: 33 rss: 74Mb L: 3/9 MS: 1 EraseBytes- 00:07:31.669 [2024-11-30 15:44:39.354292] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000970a cdw11:00000000 00:07:31.669 [2024-11-30 15:44:39.354318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.669 #34 NEW cov: 12451 ft: 15012 corp: 25/78b lim: 10 exec/s: 34 rss: 74Mb L: 3/9 MS: 1 InsertByte- 00:07:31.669 [2024-11-30 15:44:39.394515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e8f5 cdw11:00000000 00:07:31.669 [2024-11-30 15:44:39.394540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.669 [2024-11-30 15:44:39.394615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002c17 cdw11:00000000 00:07:31.669 [2024-11-30 15:44:39.394629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.669 [2024-11-30 15:44:39.394683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000170a cdw11:00000000 00:07:31.669 [2024-11-30 15:44:39.394710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.669 #35 NEW cov: 12451 ft: 15019 corp: 26/84b lim: 10 exec/s: 35 rss: 74Mb L: 6/9 MS: 1 ChangeBinInt- 00:07:31.669 [2024-11-30 15:44:39.434342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.669 [2024-11-30 15:44:39.434367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.669 #36 NEW cov: 12451 ft: 15028 corp: 27/86b lim: 10 exec/s: 36 rss: 74Mb L: 2/9 MS: 1 ShuffleBytes- 00:07:31.669 [2024-11-30 15:44:39.494484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008c0a cdw11:00000000 00:07:31.669 [2024-11-30 15:44:39.494508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.669 [2024-11-30 15:44:39.494579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002a0a cdw11:00000000 00:07:31.669 [2024-11-30 15:44:39.494593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.669 #37 NEW cov: 12451 ft: 15031 corp: 28/91b lim: 10 exec/s: 37 rss: 74Mb L: 5/9 MS: 1 InsertByte- 00:07:31.669 [2024-11-30 15:44:39.554370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.669 [2024-11-30 15:44:39.554396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.669 #38 NEW cov: 12451 ft: 15036 corp: 29/93b lim: 10 exec/s: 38 rss: 74Mb L: 2/9 MS: 1 CopyPart- 00:07:31.669 [2024-11-30 15:44:39.594408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0b cdw11:00000000 00:07:31.669 [2024-11-30 15:44:39.594434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.984 #39 NEW cov: 12451 ft: 15053 corp: 30/95b lim: 10 
exec/s: 39 rss: 74Mb L: 2/9 MS: 1 ChangeBit- 00:07:31.984 [2024-11-30 15:44:39.634567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.634593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.984 [2024-11-30 15:44:39.634657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.634672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.984 #40 NEW cov: 12451 ft: 15095 corp: 31/99b lim: 10 exec/s: 40 rss: 74Mb L: 4/9 MS: 1 CopyPart- 00:07:31.984 [2024-11-30 15:44:39.694461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000190a cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.694487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.984 #41 NEW cov: 12451 ft: 15106 corp: 32/102b lim: 10 exec/s: 41 rss: 74Mb L: 3/9 MS: 1 ShuffleBytes- 00:07:31.984 [2024-11-30 15:44:39.734968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.734994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.984 [2024-11-30 15:44:39.735047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.735060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.984 [2024-11-30 15:44:39.735113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.735130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.984 [2024-11-30 15:44:39.735183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.735196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.984 [2024-11-30 15:44:39.735249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff2c cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.735262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:31.984 #42 NEW cov: 12451 ft: 15153 corp: 33/112b lim: 10 exec/s: 42 rss: 74Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:31.984 [2024-11-30 15:44:39.774476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000c08 cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.774501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.984 #43 NEW cov: 12451 ft: 15191 corp: 34/114b lim: 10 exec/s: 43 rss: 74Mb L: 2/10 MS: 1 ChangeBit- 00:07:31.984 [2024-11-30 15:44:39.834853] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a2c cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.834880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.984 [2024-11-30 15:44:39.834934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.834947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.984 [2024-11-30 15:44:39.834999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.835013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.984 [2024-11-30 15:44:39.835065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.835078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.984 #44 NEW cov: 12451 ft: 15201 corp: 35/122b lim: 10 exec/s: 44 rss: 74Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:31.984 [2024-11-30 15:44:39.874610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:31.984 [2024-11-30 15:44:39.874636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.985 [2024-11-30 15:44:39.874690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:31.985 [2024-11-30 15:44:39.874704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.985 #45 NEW cov: 12451 ft: 15210 corp: 36/127b lim: 10 exec/s: 45 rss: 74Mb L: 5/10 MS: 1 InsertRepeatedBytes- 00:07:31.985 [2024-11-30 15:44:39.914879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00009f9f cdw11:00000000 00:07:31.985 [2024-11-30 15:44:39.914908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.985 [2024-11-30 15:44:39.914971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009f9f cdw11:00000000 00:07:31.985 [2024-11-30 15:44:39.914991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.985 [2024-11-30 15:44:39.915055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00009f9f cdw11:00000000 00:07:31.985 [2024-11-30 15:44:39.915072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.985 [2024-11-30 15:44:39.915133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00009f08 cdw11:00000000 00:07:31.985 [2024-11-30 15:44:39.915152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.282 #46 NEW cov: 12451 ft: 
15278 corp: 37/136b lim: 10 exec/s: 23 rss: 74Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:32.282 #46 DONE cov: 12451 ft: 15278 corp: 37/136b lim: 10 exec/s: 23 rss: 74Mb 00:07:32.282 Done 46 runs in 2 second(s) 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:32.282 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:32.283 15:44:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:07:32.283 [2024-11-30 15:44:40.093292] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:32.283 [2024-11-30 15:44:40.093384] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1714021 ] 00:07:32.542 [2024-11-30 15:44:40.410217] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
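The ../common.sh lines traced above ((( i++ )), (( i < fuzz_num )), start_llvm_fuzz 8 1 0x1) advance the outer loop that runs every fuzzer type once. A minimal sketch of that driver loop, reconstructed from the trace (fuzz_num, timen, and core are assumed names taken from the start_llvm_fuzz locals seen earlier, not the verbatim common.sh source):
  # Sketch of the ../common.sh driver loop; reconstructed from the
  # traced (( i++ )) / (( i < fuzz_num )) lines, names are assumptions.
  i=0
  while (( i < fuzz_num )); do
      start_llvm_fuzz "$i" "$timen" "$core"   # fuzzer type, minutes, core mask
      (( i++ ))
  done
Each iteration produces a run like the one starting here: the target listens on its per-type port (4408 for type 8), libFuzzer prints its seed and loaded counter/PC tables, and then emits one "#N NEW cov: ... ft: ... corp: ..." status record per coverage-increasing input until the one-minute budget expires.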
00:07:32.542 [2024-11-30 15:44:40.458480] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.542 [2024-11-30 15:44:40.480692] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.801 [2024-11-30 15:44:40.533664] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:32.801 [2024-11-30 15:44:40.549980] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:32.801 INFO: Running with entropic power schedule (0xFF, 100). 00:07:32.801 INFO: Seed: 1591136784 00:07:32.801 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:32.801 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:32.801 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:32.801 INFO: A corpus is not provided, starting from an empty corpus 00:07:32.801 [2024-11-30 15:44:40.605241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.801 [2024-11-30 15:44:40.605270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.801 #2 INITED cov: 12243 ft: 12241 corp: 1/1b exec/s: 0 rss: 71Mb 00:07:32.801 [2024-11-30 15:44:40.645302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.801 [2024-11-30 15:44:40.645329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.801 [2024-11-30 15:44:40.645385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.801 [2024-11-30 15:44:40.645399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.801 #3 NEW cov: 12364 ft: 13377 corp: 2/3b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 InsertByte- 00:07:32.801 [2024-11-30 15:44:40.705326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.802 [2024-11-30 15:44:40.705351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.802 [2024-11-30 15:44:40.705408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.802 [2024-11-30 15:44:40.705422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.802 #4 NEW cov: 12370 ft: 13679 corp: 3/5b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 CrossOver- 00:07:32.802 [2024-11-30 15:44:40.745342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.802 [2024-11-30 15:44:40.745369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.802 [2024-11-30 15:44:40.745425] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.802 [2024-11-30 15:44:40.745440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.062 #5 NEW cov: 12455 ft: 13920 corp: 4/7b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 ChangeByte- 00:07:33.062 [2024-11-30 15:44:40.805327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.062 [2024-11-30 15:44:40.805354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.062 [2024-11-30 15:44:40.805411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.062 [2024-11-30 15:44:40.805426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.062 #6 NEW cov: 12455 ft: 13966 corp: 5/9b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 ChangeByte- 00:07:33.062 [2024-11-30 15:44:40.865227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.062 [2024-11-30 15:44:40.865257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.062 #7 NEW cov: 12455 ft: 14028 corp: 6/10b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 EraseBytes- 00:07:33.062 [2024-11-30 15:44:40.905525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.062 [2024-11-30 15:44:40.905551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.062 [2024-11-30 15:44:40.905611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.062 [2024-11-30 15:44:40.905625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.062 [2024-11-30 15:44:40.905681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.062 [2024-11-30 15:44:40.905695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.062 #8 NEW cov: 12455 ft: 14297 corp: 7/13b lim: 5 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 InsertByte- 00:07:33.062 [2024-11-30 15:44:40.965404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.062 [2024-11-30 15:44:40.965429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.062 [2024-11-30 15:44:40.965487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.062 [2024-11-30 15:44:40.965501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.062 #9 NEW cov: 12455 ft: 14399 corp: 8/15b lim: 5 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 ShuffleBytes- 00:07:33.062 [2024-11-30 15:44:41.025236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.062 [2024-11-30 15:44:41.025261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.321 #10 NEW cov: 12455 ft: 14454 corp: 9/16b lim: 5 exec/s: 0 rss: 72Mb L: 1/3 MS: 1 ChangeBit- 00:07:33.321 [2024-11-30 15:44:41.065433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.321 [2024-11-30 15:44:41.065458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.321 [2024-11-30 15:44:41.065530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.321 [2024-11-30 15:44:41.065544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.321 #11 NEW cov: 12455 ft: 14562 corp: 10/18b lim: 5 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 InsertByte- 00:07:33.321 [2024-11-30 15:44:41.105421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.321 [2024-11-30 15:44:41.105446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.321 [2024-11-30 15:44:41.105517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.321 [2024-11-30 15:44:41.105534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.321 #12 NEW cov: 12455 ft: 14600 corp: 11/20b lim: 5 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 ChangeBit- 00:07:33.321 [2024-11-30 15:44:41.165276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.321 [2024-11-30 15:44:41.165302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.321 #13 NEW cov: 12455 ft: 14665 corp: 12/21b lim: 5 exec/s: 0 rss: 72Mb L: 1/3 MS: 1 ChangeBit- 00:07:33.321 [2024-11-30 15:44:41.225624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.321 [2024-11-30 15:44:41.225649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.321 [2024-11-30 15:44:41.225725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.321 [2024-11-30 15:44:41.225749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.321 [2024-11-30 15:44:41.225803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.321 [2024-11-30 15:44:41.225817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.322 #14 NEW cov: 12455 ft: 14682 corp: 13/24b lim: 5 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 InsertByte- 00:07:33.322 [2024-11-30 15:44:41.265469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.322 [2024-11-30 15:44:41.265494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.322 [2024-11-30 15:44:41.265550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.322 [2024-11-30 15:44:41.265564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.582 #15 NEW cov: 12455 ft: 14775 corp: 14/26b lim: 5 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 CopyPart- 00:07:33.582 [2024-11-30 15:44:41.305482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.582 [2024-11-30 15:44:41.305507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.582 [2024-11-30 15:44:41.305580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.582 [2024-11-30 15:44:41.305594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.582 #16 NEW cov: 12455 ft: 14825 corp: 15/28b lim: 5 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 ShuffleBytes- 00:07:33.582 [2024-11-30 15:44:41.345696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.582 [2024-11-30 15:44:41.345721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.582 [2024-11-30 15:44:41.345793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.582 [2024-11-30 15:44:41.345808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.582 [2024-11-30 15:44:41.345865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.582 [2024-11-30 15:44:41.345878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:07:33.582 #17 NEW cov: 12455 ft: 14851 corp: 16/31b lim: 5 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 InsertByte- 00:07:33.582 [2024-11-30 15:44:41.405390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.582 [2024-11-30 15:44:41.405415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.582 #18 NEW cov: 12455 ft: 14868 corp: 17/32b lim: 5 exec/s: 0 rss: 72Mb L: 1/3 MS: 1 EraseBytes- 00:07:33.582 [2024-11-30 15:44:41.465428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.582 [2024-11-30 15:44:41.465453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.840 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:33.840 #19 NEW cov: 12478 ft: 14920 corp: 18/33b lim: 5 exec/s: 19 rss: 73Mb L: 1/3 MS: 1 ChangeBinInt- 00:07:33.840 [2024-11-30 15:44:41.776141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.840 [2024-11-30 15:44:41.776180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.840 [2024-11-30 15:44:41.776243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.840 [2024-11-30 15:44:41.776261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.840 [2024-11-30 15:44:41.776323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.840 [2024-11-30 15:44:41.776340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.840 [2024-11-30 15:44:41.776402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.840 [2024-11-30 15:44:41.776419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.840 #20 NEW cov: 12478 ft: 15281 corp: 19/37b lim: 5 exec/s: 20 rss: 73Mb L: 4/4 MS: 1 CopyPart- 00:07:34.100 [2024-11-30 15:44:41.816105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:41.816132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.100 [2024-11-30 15:44:41.816188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:41.816201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.100 [2024-11-30 15:44:41.816259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:41.816272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.100 [2024-11-30 15:44:41.816338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:41.816351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.100 [2024-11-30 15:44:41.816406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:41.816419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.100 #21 NEW cov: 12478 ft: 15340 corp: 20/42b lim: 5 exec/s: 21 rss: 74Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:34.100 [2024-11-30 15:44:41.875648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:41.875675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.100 [2024-11-30 15:44:41.875748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:41.875763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.100 #22 NEW cov: 12478 ft: 15356 corp: 21/44b lim: 5 exec/s: 22 rss: 74Mb L: 2/5 MS: 1 CrossOver- 00:07:34.100 [2024-11-30 15:44:41.935503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:41.935530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.100 #23 NEW cov: 12478 ft: 15373 corp: 22/45b lim: 5 exec/s: 23 rss: 74Mb L: 1/5 MS: 1 ChangeByte- 00:07:34.100 [2024-11-30 15:44:41.975698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:41.975725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.100 [2024-11-30 15:44:41.975779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:41.975793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.100 #24 NEW cov: 12478 ft: 15402 corp: 23/47b lim: 5 exec/s: 24 rss: 74Mb L: 2/5 MS: 1 CopyPart- 00:07:34.100 [2024-11-30 15:44:42.015732] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:42.015758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.100 [2024-11-30 15:44:42.015814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.100 [2024-11-30 15:44:42.015827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.100 #25 NEW cov: 12478 ft: 15411 corp: 24/49b lim: 5 exec/s: 25 rss: 74Mb L: 2/5 MS: 1 ChangeBit- 00:07:34.359 [2024-11-30 15:44:42.075629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.359 [2024-11-30 15:44:42.075655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.359 #26 NEW cov: 12478 ft: 15433 corp: 25/50b lim: 5 exec/s: 26 rss: 74Mb L: 1/5 MS: 1 CopyPart- 00:07:34.359 [2024-11-30 15:44:42.115780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.359 [2024-11-30 15:44:42.115805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.359 [2024-11-30 15:44:42.115864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.359 [2024-11-30 15:44:42.115878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.359 #27 NEW cov: 12478 ft: 15453 corp: 26/52b lim: 5 exec/s: 27 rss: 74Mb L: 2/5 MS: 1 CopyPart- 00:07:34.359 [2024-11-30 15:44:42.175783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.359 [2024-11-30 15:44:42.175809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.359 [2024-11-30 15:44:42.175865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.359 [2024-11-30 15:44:42.175878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.359 #28 NEW cov: 12478 ft: 15492 corp: 27/54b lim: 5 exec/s: 28 rss: 74Mb L: 2/5 MS: 1 ChangeByte- 00:07:34.359 [2024-11-30 15:44:42.215688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.359 [2024-11-30 15:44:42.215714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.359 #29 NEW cov: 12478 ft: 15510 corp: 28/55b lim: 5 exec/s: 29 rss: 74Mb L: 1/5 MS: 1 EraseBytes- 00:07:34.359 [2024-11-30 15:44:42.256030] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.359 [2024-11-30 15:44:42.256056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.359 [2024-11-30 15:44:42.256112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.359 [2024-11-30 15:44:42.256125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.360 [2024-11-30 15:44:42.256180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.360 [2024-11-30 15:44:42.256193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.360 #30 NEW cov: 12478 ft: 15515 corp: 29/58b lim: 5 exec/s: 30 rss: 74Mb L: 3/5 MS: 1 InsertByte- 00:07:34.360 [2024-11-30 15:44:42.295873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.360 [2024-11-30 15:44:42.295898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.360 [2024-11-30 15:44:42.295973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.360 [2024-11-30 15:44:42.295987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.360 #31 NEW cov: 12478 ft: 15516 corp: 30/60b lim: 5 exec/s: 31 rss: 74Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:34.619 [2024-11-30 15:44:42.335748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 15:44:42.335777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.619 #32 NEW cov: 12478 ft: 15519 corp: 31/61b lim: 5 exec/s: 32 rss: 74Mb L: 1/5 MS: 1 ChangeByte- 00:07:34.619 [2024-11-30 15:44:42.396251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 15:44:42.396276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.619 [2024-11-30 15:44:42.396332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 15:44:42.396347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.619 [2024-11-30 15:44:42.396404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 
15:44:42.396418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.619 [2024-11-30 15:44:42.396473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 15:44:42.396487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.619 #33 NEW cov: 12478 ft: 15532 corp: 32/65b lim: 5 exec/s: 33 rss: 74Mb L: 4/5 MS: 1 CopyPart- 00:07:34.619 [2024-11-30 15:44:42.435782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 15:44:42.435808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.619 #34 NEW cov: 12478 ft: 15539 corp: 33/66b lim: 5 exec/s: 34 rss: 74Mb L: 1/5 MS: 1 EraseBytes- 00:07:34.619 [2024-11-30 15:44:42.475974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 15:44:42.475999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.619 [2024-11-30 15:44:42.476055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 15:44:42.476069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.619 #35 NEW cov: 12478 ft: 15585 corp: 34/68b lim: 5 exec/s: 35 rss: 74Mb L: 2/5 MS: 1 InsertByte- 00:07:34.619 [2024-11-30 15:44:42.536277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 15:44:42.536302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.619 [2024-11-30 15:44:42.536375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 15:44:42.536390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.619 [2024-11-30 15:44:42.536444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 15:44:42.536463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.619 [2024-11-30 15:44:42.536519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.619 [2024-11-30 15:44:42.536533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.619 #36 NEW cov: 12478 ft: 15599 corp: 35/72b lim: 5 exec/s: 
36 rss: 74Mb L: 4/5 MS: 1 ChangeByte- 00:07:34.879 [2024-11-30 15:44:42.596205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.879 [2024-11-30 15:44:42.596231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.879 [2024-11-30 15:44:42.596288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.879 [2024-11-30 15:44:42.596302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.879 [2024-11-30 15:44:42.596359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.879 [2024-11-30 15:44:42.596372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.879 #37 NEW cov: 12478 ft: 15614 corp: 36/75b lim: 5 exec/s: 18 rss: 74Mb L: 3/5 MS: 1 InsertByte- 00:07:34.879 #37 DONE cov: 12478 ft: 15614 corp: 36/75b lim: 5 exec/s: 18 rss: 74Mb 00:07:34.879 Done 37 runs in 2 second(s) 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:34.879 15:44:42 llvm_fuzz.nvmf_llvm_fuzz -- 
nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:07:34.879 [2024-11-30 15:44:42.784535] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:34.879 [2024-11-30 15:44:42.784615] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1714559 ] 00:07:35.138 [2024-11-30 15:44:43.102023] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:35.398 [2024-11-30 15:44:43.149258] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.398 [2024-11-30 15:44:43.168286] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.398 [2024-11-30 15:44:43.220743] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.398 [2024-11-30 15:44:43.237048] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:35.398 INFO: Running with entropic power schedule (0xFF, 100). 00:07:35.398 INFO: Seed: 4278104376 00:07:35.398 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:35.398 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:35.398 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:35.398 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.398 [2024-11-30 15:44:43.281756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.398 [2024-11-30 15:44:43.281790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.398 #2 INITED cov: 12251 ft: 12207 corp: 1/1b exec/s: 0 rss: 71Mb 00:07:35.398 [2024-11-30 15:44:43.331824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.398 [2024-11-30 15:44:43.331856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.398 [2024-11-30 15:44:43.331905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.398 [2024-11-30 15:44:43.331922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.398 [2024-11-30 15:44:43.331952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.398 [2024-11-30 15:44:43.331968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.398 
[2024-11-30 15:44:43.331997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.398 [2024-11-30 15:44:43.332013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.657 #3 NEW cov: 12364 ft: 13607 corp: 2/5b lim: 5 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:35.657 [2024-11-30 15:44:43.421838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.657 [2024-11-30 15:44:43.421870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.657 [2024-11-30 15:44:43.421903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.657 [2024-11-30 15:44:43.421919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.657 [2024-11-30 15:44:43.421953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.657 [2024-11-30 15:44:43.421969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.657 [2024-11-30 15:44:43.421997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.657 [2024-11-30 15:44:43.422012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.657 #4 NEW cov: 12370 ft: 13897 corp: 3/9b lim: 5 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 ChangeBinInt- 00:07:35.657 [2024-11-30 15:44:43.511609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.657 [2024-11-30 15:44:43.511640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.657 #5 NEW cov: 12455 ft: 14183 corp: 4/10b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 CrossOver- 00:07:35.657 [2024-11-30 15:44:43.571700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.657 [2024-11-30 15:44:43.571731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.657 #6 NEW cov: 12455 ft: 14316 corp: 5/11b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 ChangeBit- 00:07:35.658 [2024-11-30 15:44:43.621729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.658 [2024-11-30 15:44:43.621761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.917 #7 NEW cov: 12455 ft: 14418 corp: 6/12b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 ShuffleBytes- 
00:07:35.917 [2024-11-30 15:44:43.671779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-11-30 15:44:43.671810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.917 [2024-11-30 15:44:43.671857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-11-30 15:44:43.671872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.917 #8 NEW cov: 12455 ft: 14659 corp: 7/14b lim: 5 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 InsertByte- 00:07:35.917 [2024-11-30 15:44:43.731895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-11-30 15:44:43.731927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.917 [2024-11-30 15:44:43.731974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-11-30 15:44:43.731991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.917 [2024-11-30 15:44:43.732020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-11-30 15:44:43.732035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.917 [2024-11-30 15:44:43.732063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-11-30 15:44:43.732082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.917 #9 NEW cov: 12455 ft: 14681 corp: 8/18b lim: 5 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 ShuffleBytes- 00:07:35.917 [2024-11-30 15:44:43.821818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-11-30 15:44:43.821848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.917 [2024-11-30 15:44:43.821881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.917 [2024-11-30 15:44:43.821897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.176 #10 NEW cov: 12455 ft: 14723 corp: 9/20b lim: 5 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 CopyPart- 00:07:36.176 [2024-11-30 15:44:43.911768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:36.176 [2024-11-30 15:44:43.911798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.176 #11 NEW cov: 12455 ft: 14828 corp: 10/21b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 ChangeBit- 00:07:36.176 [2024-11-30 15:44:44.001947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.176 [2024-11-30 15:44:44.001976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.176 [2024-11-30 15:44:44.002024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.176 [2024-11-30 15:44:44.002040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.176 [2024-11-30 15:44:44.002068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.177 [2024-11-30 15:44:44.002083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.177 [2024-11-30 15:44:44.002111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.177 [2024-11-30 15:44:44.002127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.177 #12 NEW cov: 12455 ft: 14852 corp: 11/25b lim: 5 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 CopyPart- 00:07:36.177 [2024-11-30 15:44:44.061763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.177 [2024-11-30 15:44:44.061794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.177 #13 NEW cov: 12455 ft: 14872 corp: 12/26b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 CrossOver- 00:07:36.177 [2024-11-30 15:44:44.111772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.177 [2024-11-30 15:44:44.111802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.695 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:36.695 #14 NEW cov: 12478 ft: 14891 corp: 13/27b lim: 5 exec/s: 14 rss: 74Mb L: 1/4 MS: 1 ChangeBit- 00:07:36.695 [2024-11-30 15:44:44.451998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.695 [2024-11-30 15:44:44.452035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.695 [2024-11-30 15:44:44.452083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:36.695 [2024-11-30 15:44:44.452099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.695 #15 NEW cov: 12478 ft: 14931 corp: 14/29b lim: 5 exec/s: 15 rss: 74Mb L: 2/4 MS: 1 EraseBytes- 00:07:36.695 [2024-11-30 15:44:44.512055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.695 [2024-11-30 15:44:44.512085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.695 [2024-11-30 15:44:44.512118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.695 [2024-11-30 15:44:44.512134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.695 [2024-11-30 15:44:44.512162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.695 [2024-11-30 15:44:44.512178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.695 [2024-11-30 15:44:44.512206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.695 [2024-11-30 15:44:44.512221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.695 #16 NEW cov: 12478 ft: 14965 corp: 15/33b lim: 5 exec/s: 16 rss: 74Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:36.695 [2024-11-30 15:44:44.601905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.695 [2024-11-30 15:44:44.601935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.954 #17 NEW cov: 12478 ft: 14996 corp: 16/34b lim: 5 exec/s: 17 rss: 74Mb L: 1/4 MS: 1 ChangeByte- 00:07:36.954 [2024-11-30 15:44:44.691944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.954 [2024-11-30 15:44:44.691974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.954 #18 NEW cov: 12478 ft: 15019 corp: 17/35b lim: 5 exec/s: 18 rss: 74Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:36.954 [2024-11-30 15:44:44.741937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.954 [2024-11-30 15:44:44.741967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.954 [2024-11-30 15:44:44.742014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.954 [2024-11-30 15:44:44.742030] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.954 #19 NEW cov: 12478 ft: 15031 corp: 18/37b lim: 5 exec/s: 19 rss: 74Mb L: 2/4 MS: 1 CopyPart- 00:07:36.954 [2024-11-30 15:44:44.822522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.954 [2024-11-30 15:44:44.822552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.954 #20 NEW cov: 12478 ft: 15221 corp: 19/38b lim: 5 exec/s: 20 rss: 74Mb L: 1/4 MS: 1 ChangeByte- 00:07:36.954 [2024-11-30 15:44:44.862513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.954 [2024-11-30 15:44:44.862539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.954 #21 NEW cov: 12478 ft: 15308 corp: 20/39b lim: 5 exec/s: 21 rss: 74Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:36.954 [2024-11-30 15:44:44.902568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.954 [2024-11-30 15:44:44.902596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.213 #22 NEW cov: 12478 ft: 15330 corp: 21/40b lim: 5 exec/s: 22 rss: 74Mb L: 1/4 MS: 1 ChangeByte- 00:07:37.213 [2024-11-30 15:44:44.962764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:44.962791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.214 [2024-11-30 15:44:44.962850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:44.962865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.214 #23 NEW cov: 12478 ft: 15345 corp: 22/42b lim: 5 exec/s: 23 rss: 74Mb L: 2/4 MS: 1 ChangeByte- 00:07:37.214 [2024-11-30 15:44:45.022618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:45.022645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.214 #24 NEW cov: 12478 ft: 15382 corp: 23/43b lim: 5 exec/s: 24 rss: 74Mb L: 1/4 MS: 1 CopyPart- 00:07:37.214 [2024-11-30 15:44:45.062785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:45.062812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.214 [2024-11-30 15:44:45.062869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:45.062883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.214 #25 NEW cov: 12478 ft: 15397 corp: 24/45b lim: 5 exec/s: 25 rss: 74Mb L: 2/4 MS: 1 InsertByte- 00:07:37.214 [2024-11-30 15:44:45.103261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:45.103287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.214 [2024-11-30 15:44:45.103345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:45.103360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.214 [2024-11-30 15:44:45.103418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:45.103435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.214 [2024-11-30 15:44:45.103492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:45.103505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.214 [2024-11-30 15:44:45.103561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:45.103575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.214 #26 NEW cov: 12478 ft: 15449 corp: 25/50b lim: 5 exec/s: 26 rss: 74Mb L: 5/5 MS: 1 CrossOver- 00:07:37.214 [2024-11-30 15:44:45.142811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:45.142837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.214 [2024-11-30 15:44:45.142896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.214 [2024-11-30 15:44:45.142910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.474 #27 NEW cov: 12478 ft: 15487 corp: 26/52b lim: 5 exec/s: 27 rss: 74Mb L: 2/5 MS: 1 ChangeBit- 00:07:37.474 [2024-11-30 15:44:45.203313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.474 [2024-11-30 15:44:45.203339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:37.474 [2024-11-30 15:44:45.203413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.474 [2024-11-30 15:44:45.203427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.474 [2024-11-30 15:44:45.203485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.474 [2024-11-30 15:44:45.203499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.474 [2024-11-30 15:44:45.203556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.474 [2024-11-30 15:44:45.203569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.474 [2024-11-30 15:44:45.203631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.474 [2024-11-30 15:44:45.203644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.474 #28 NEW cov: 12478 ft: 15540 corp: 27/57b lim: 5 exec/s: 28 rss: 74Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:37.474 [2024-11-30 15:44:45.262888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.474 [2024-11-30 15:44:45.262915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.474 [2024-11-30 15:44:45.262991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.474 [2024-11-30 15:44:45.263007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.474 #29 NEW cov: 12478 ft: 15576 corp: 28/59b lim: 5 exec/s: 14 rss: 74Mb L: 2/5 MS: 1 ChangeBit- 00:07:37.474 #29 DONE cov: 12478 ft: 15576 corp: 28/59b lim: 5 exec/s: 14 rss: 74Mb 00:07:37.474 Done 29 runs in 2 second(s) 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local 
nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:37.474 15:44:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:07:37.733 [2024-11-30 15:44:45.452872] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:37.733 [2024-11-30 15:44:45.452943] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1715054 ] 00:07:37.992 [2024-11-30 15:44:45.765096] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:37.992 [2024-11-30 15:44:45.812360] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.992 [2024-11-30 15:44:45.835153] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.992 [2024-11-30 15:44:45.887725] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:37.992 [2024-11-30 15:44:45.904041] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:37.992 INFO: Running with entropic power schedule (0xFF, 100). 00:07:37.992 INFO: Seed: 2650145567 00:07:37.992 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:37.992 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:37.992 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:37.992 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.992 #2 INITED exec/s: 0 rss: 64Mb 00:07:37.992 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:37.992 This may also happen if the target rejected all inputs we tried so far 00:07:37.992 [2024-11-30 15:44:45.948743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.992 [2024-11-30 15:44:45.948786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.511 NEW_FUNC[1/714]: 0x46bbc8 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:38.511 NEW_FUNC[2/714]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.511 #11 NEW cov: 12268 ft: 12265 corp: 2/10b lim: 40 exec/s: 0 rss: 72Mb L: 9/9 MS: 4 ChangeBinInt-InsertByte-EraseBytes-CMP- DE: "\000\000\000\000\000\000\000i"- 00:07:38.511 [2024-11-30 15:44:46.288726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000069 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.511 [2024-11-30 15:44:46.288763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.511 NEW_FUNC[1/2]: 0xfc9ac8 in spdk_get_ticks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:321 00:07:38.511 NEW_FUNC[2/2]: 0x1a69b78 in nvme_tcp_read_data /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/nvme_tcp.h:405 00:07:38.511 #12 NEW cov: 12387 ft: 12860 corp: 3/19b lim: 40 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000i"- 00:07:38.511 [2024-11-30 15:44:46.378670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:0000003f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.511 [2024-11-30 15:44:46.378701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.511 #13 NEW cov: 12393 ft: 13084 corp: 4/28b lim: 40 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeByte- 00:07:38.511 [2024-11-30 15:44:46.428694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000069 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.511 [2024-11-30 15:44:46.428725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.770 #14 NEW cov: 12478 ft: 13295 corp: 5/37b lim: 40 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 ChangeBit- 00:07:38.770 [2024-11-30 15:44:46.518743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.770 [2024-11-30 15:44:46.518775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.770 #16 NEW cov: 12478 ft: 13503 corp: 6/46b lim: 40 exec/s: 0 rss: 72Mb L: 9/9 MS: 2 ShuffleBytes-PersAutoDict- DE: "\000\000\000\000\000\000\000i"- 00:07:38.770 [2024-11-30 15:44:46.578881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:7a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.770 [2024-11-30 15:44:46.578913] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.770 [2024-11-30 15:44:46.578950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.770 [2024-11-30 15:44:46.578967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.770 [2024-11-30 15:44:46.578998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.770 [2024-11-30 15:44:46.579019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.770 [2024-11-30 15:44:46.579051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.770 [2024-11-30 15:44:46.579067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.770 #19 NEW cov: 12478 ft: 14195 corp: 7/85b lim: 40 exec/s: 0 rss: 72Mb L: 39/39 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:38.770 [2024-11-30 15:44:46.638688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e1000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.770 [2024-11-30 15:44:46.638718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.770 #20 NEW cov: 12478 ft: 14278 corp: 8/95b lim: 40 exec/s: 0 rss: 72Mb L: 10/39 MS: 1 InsertByte- 00:07:38.770 [2024-11-30 15:44:46.728814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01000a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.770 [2024-11-30 15:44:46.728844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.770 [2024-11-30 15:44:46.728893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00006900 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.770 [2024-11-30 15:44:46.728909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.030 #21 NEW cov: 12478 ft: 14566 corp: 9/113b lim: 40 exec/s: 0 rss: 72Mb L: 18/39 MS: 1 CrossOver- 00:07:39.030 [2024-11-30 15:44:46.799279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.799306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.030 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:39.030 #22 NEW cov: 12501 ft: 14761 corp: 10/122b lim: 40 exec/s: 0 rss: 72Mb L: 9/39 MS: 1 CopyPart- 00:07:39.030 [2024-11-30 15:44:46.859701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:7a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:39.030 [2024-11-30 15:44:46.859733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.030 [2024-11-30 15:44:46.859792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2ae0d52a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.859807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.030 [2024-11-30 15:44:46.859864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.859878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.030 [2024-11-30 15:44:46.859936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.859950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.030 #23 NEW cov: 12501 ft: 14844 corp: 11/161b lim: 40 exec/s: 0 rss: 72Mb L: 39/39 MS: 1 ChangeBinInt- 00:07:39.030 [2024-11-30 15:44:46.919743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:7a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.919768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.030 [2024-11-30 15:44:46.919826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2ae0d52a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.919840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.030 [2024-11-30 15:44:46.919901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.919914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.030 [2024-11-30 15:44:46.919967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2a2a0100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.919981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.030 #24 NEW cov: 12501 ft: 14892 corp: 12/200b lim: 40 exec/s: 24 rss: 72Mb L: 39/39 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\001"- 00:07:39.030 [2024-11-30 15:44:46.979866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:7a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.979891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.030 [2024-11-30 15:44:46.979950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) 
qid:0 cid:5 nsid:0 cdw10:2ae0d52a cdw11:2a2a2a7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.979963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.030 [2024-11-30 15:44:46.980021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.980035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.030 [2024-11-30 15:44:46.980091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2a2a2a01 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.980104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.030 [2024-11-30 15:44:46.980163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:0000012a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.030 [2024-11-30 15:44:46.980175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.289 #25 NEW cov: 12501 ft: 14938 corp: 13/240b lim: 40 exec/s: 25 rss: 72Mb L: 40/40 MS: 1 InsertByte- 00:07:39.289 [2024-11-30 15:44:47.039377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000069 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.290 [2024-11-30 15:44:47.039401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.290 #26 NEW cov: 12501 ft: 15071 corp: 14/249b lim: 40 exec/s: 26 rss: 72Mb L: 9/40 MS: 1 ShuffleBytes- 00:07:39.290 [2024-11-30 15:44:47.079758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:7a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.290 [2024-11-30 15:44:47.079782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.290 [2024-11-30 15:44:47.079860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2ae0d52a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.290 [2024-11-30 15:44:47.079874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.290 [2024-11-30 15:44:47.079931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.290 [2024-11-30 15:44:47.079944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.290 [2024-11-30 15:44:47.080002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a27 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.290 [2024-11-30 15:44:47.080016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.290 #27 NEW cov: 12501 ft: 15166 corp: 15/288b lim: 40 
exec/s: 27 rss: 72Mb L: 39/40 MS: 1 ChangeByte- 00:07:39.290 [2024-11-30 15:44:47.119789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:7a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.290 [2024-11-30 15:44:47.119813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.290 [2024-11-30 15:44:47.119886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2ae0d52a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.290 [2024-11-30 15:44:47.119900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.290 [2024-11-30 15:44:47.119957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2a2a222a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.290 [2024-11-30 15:44:47.119971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.290 [2024-11-30 15:44:47.120028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.290 [2024-11-30 15:44:47.120041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.290 #28 NEW cov: 12501 ft: 15175 corp: 16/327b lim: 40 exec/s: 28 rss: 72Mb L: 39/40 MS: 1 ChangeBit- 00:07:39.290 [2024-11-30 15:44:47.159399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e1000000 cdw11:0a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.290 [2024-11-30 15:44:47.159423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.290 #29 NEW cov: 12501 ft: 15183 corp: 17/337b lim: 40 exec/s: 29 rss: 72Mb L: 10/40 MS: 1 ChangeBinInt- 00:07:39.290 [2024-11-30 15:44:47.219447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:000000c1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.290 [2024-11-30 15:44:47.219471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.549 #30 NEW cov: 12501 ft: 15188 corp: 18/346b lim: 40 exec/s: 30 rss: 72Mb L: 9/40 MS: 1 ChangeBinInt- 00:07:39.549 [2024-11-30 15:44:47.279873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:7a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.279897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.549 [2024-11-30 15:44:47.279970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2ae0d52a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.279987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.549 [2024-11-30 15:44:47.280047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2a2a2a2a 
cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.280061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.549 [2024-11-30 15:44:47.280115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.280129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.549 #31 NEW cov: 12501 ft: 15202 corp: 19/385b lim: 40 exec/s: 31 rss: 72Mb L: 39/40 MS: 1 ShuffleBytes- 00:07:39.549 [2024-11-30 15:44:47.319731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:96000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.319755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.549 [2024-11-30 15:44:47.319831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.319845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.549 [2024-11-30 15:44:47.319905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.319919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.549 #33 NEW cov: 12501 ft: 15432 corp: 20/409b lim: 40 exec/s: 33 rss: 72Mb L: 24/40 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:39.549 [2024-11-30 15:44:47.359660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01000a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.359684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.549 [2024-11-30 15:44:47.359743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00206900 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.359757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.549 #34 NEW cov: 12501 ft: 15475 corp: 21/427b lim: 40 exec/s: 34 rss: 72Mb L: 18/40 MS: 1 ChangeByte- 00:07:39.549 [2024-11-30 15:44:47.419928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:7a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.419951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.549 [2024-11-30 15:44:47.420009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2ae0d52a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.420022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.549 [2024-11-30 15:44:47.420078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.420091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.549 [2024-11-30 15:44:47.420148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.420166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.549 #35 NEW cov: 12501 ft: 15489 corp: 22/460b lim: 40 exec/s: 35 rss: 72Mb L: 33/40 MS: 1 EraseBytes- 00:07:39.549 [2024-11-30 15:44:47.479713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:96000000 cdw11:7a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.479737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.549 [2024-11-30 15:44:47.479813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0000002a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.549 [2024-11-30 15:44:47.479826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.808 #36 NEW cov: 12501 ft: 15507 corp: 23/477b lim: 40 exec/s: 36 rss: 72Mb L: 17/40 MS: 1 CrossOver- 00:07:39.808 [2024-11-30 15:44:47.539618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.808 [2024-11-30 15:44:47.539642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.808 #37 NEW cov: 12501 ft: 15524 corp: 24/487b lim: 40 exec/s: 37 rss: 73Mb L: 10/40 MS: 1 CrossOver- 00:07:39.808 [2024-11-30 15:44:47.579606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.808 [2024-11-30 15:44:47.579630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.808 #38 NEW cov: 12501 ft: 15536 corp: 25/500b lim: 40 exec/s: 38 rss: 73Mb L: 13/40 MS: 1 CopyPart- 00:07:39.808 [2024-11-30 15:44:47.619633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.808 [2024-11-30 15:44:47.619658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.808 [2024-11-30 15:44:47.679671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:03000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.808 [2024-11-30 15:44:47.679697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.808 #40 NEW cov: 12501 ft: 15538 corp: 26/513b lim: 40 exec/s: 
40 rss: 73Mb L: 13/40 MS: 2 ChangeByte-ChangeBit- 00:07:39.808 [2024-11-30 15:44:47.719914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000069 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.808 [2024-11-30 15:44:47.719938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.808 [2024-11-30 15:44:47.719998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:69e9e9e9 cdw11:e9e9e9e9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.808 [2024-11-30 15:44:47.720012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.808 [2024-11-30 15:44:47.720070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e9e9e9e9 cdw11:e9e9e9e9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.808 [2024-11-30 15:44:47.720083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.809 #41 NEW cov: 12501 ft: 15552 corp: 27/543b lim: 40 exec/s: 41 rss: 73Mb L: 30/40 MS: 1 InsertRepeatedBytes- 00:07:39.809 [2024-11-30 15:44:47.760211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:7a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.809 [2024-11-30 15:44:47.760235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.809 [2024-11-30 15:44:47.760295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2ae0d52a cdw11:2a2a2a7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.809 [2024-11-30 15:44:47.760308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.809 [2024-11-30 15:44:47.760366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2a2a2a2a cdw11:2a2a0100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.809 [2024-11-30 15:44:47.760379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.809 [2024-11-30 15:44:47.760436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2a2a2a2a cdw11:2a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.809 [2024-11-30 15:44:47.760449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.809 [2024-11-30 15:44:47.760506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:0000012a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.809 [2024-11-30 15:44:47.760519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:40.068 #42 NEW cov: 12501 ft: 15558 corp: 28/583b lim: 40 exec/s: 42 rss: 73Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:40.068 [2024-11-30 15:44:47.819733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e10000c7 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.068 [2024-11-30 15:44:47.819758] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.068 #43 NEW cov: 12501 ft: 15579 corp: 29/593b lim: 40 exec/s: 43 rss: 73Mb L: 10/40 MS: 1 ChangeByte- 00:07:40.068 [2024-11-30 15:44:47.860213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:7a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.068 [2024-11-30 15:44:47.860237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.068 [2024-11-30 15:44:47.860296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2ae0d52a cdw11:2a2a2a7a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.068 [2024-11-30 15:44:47.860310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.068 [2024-11-30 15:44:47.860369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:282a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.068 [2024-11-30 15:44:47.860382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.068 [2024-11-30 15:44:47.860440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2a2a2a01 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.068 [2024-11-30 15:44:47.860453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.068 [2024-11-30 15:44:47.860507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:0000012a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.068 [2024-11-30 15:44:47.860520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:40.068 #44 NEW cov: 12501 ft: 15590 corp: 30/633b lim: 40 exec/s: 44 rss: 73Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:40.068 [2024-11-30 15:44:47.900070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01363636 cdw11:36363636 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.068 [2024-11-30 15:44:47.900094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.068 [2024-11-30 15:44:47.900168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36363636 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.068 [2024-11-30 15:44:47.900182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.068 [2024-11-30 15:44:47.900239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:36360000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.068 [2024-11-30 15:44:47.900253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.068 #45 NEW cov: 12501 ft: 15602 corp: 31/660b lim: 40 exec/s: 22 rss: 73Mb L: 27/40 MS: 1 InsertRepeatedBytes- 00:07:40.068 #45 DONE cov: 12501 ft: 15602 corp: 31/660b lim: 40 exec/s: 22 
rss: 73Mb 00:07:40.068 ###### Recommended dictionary. ###### 00:07:40.068 "\000\000\000\000\000\000\000i" # Uses: 2 00:07:40.068 "\001\000\000\000\000\000\000\001" # Uses: 0 00:07:40.068 ###### End of recommended dictionary. ###### 00:07:40.068 Done 45 runs in 2 second(s) 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:40.328 15:44:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:07:40.328 [2024-11-30 15:44:48.089978] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:40.328 [2024-11-30 15:44:48.090049] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1715387 ] 00:07:40.587 [2024-11-30 15:44:48.402301] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
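The per-fuzzer setup traced above repeats for every fuzzer type: run.sh derives a TCP listener port (44 followed by the zero-padded fuzzer number, so type 11 listens on 4411), rewrites trsvcid in a per-run copy of fuzz_json.conf, writes LeakSanitizer suppressions for two known-benign leaks, and launches llvm_nvme_fuzz against nqn.2016-06.io.spdk:cnode1 for the requested time budget. A minimal sketch of that sequence, reconstructed from the trace (the output redirections for sed and the suppression entries are not visible in the `+` trace and are assumed; SPDK_ROOT is shorthand for the workspace path):

#!/usr/bin/env bash
# Per-run setup for fuzzer type 11, mirroring the start_llvm_fuzz trace above.
SPDK_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
fuzzer_type=11
timen=1    # -t: run time in seconds
core=0x1   # -m: reactor core mask

port="44$(printf '%02d' "$fuzzer_type")"            # -> 4411
corpus_dir="$SPDK_ROOT/../corpus/llvm_nvmf_$fuzzer_type"
nvmf_cfg="/tmp/fuzz_json_$fuzzer_type.conf"
suppress_file=/var/tmp/suppress_nvmf_fuzz

mkdir -p "$corpus_dir"

# Point this run's JSON config at its own listener port (redirection assumed).
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$SPDK_ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

# Suppress two known-benign leaks for LeakSanitizer (redirection assumed).
printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > "$suppress_file"
export LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0"

trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
"$SPDK_ROOT/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
    -m "$core" -s 512 -P "$SPDK_ROOT/../output/llvm/" \
    -F "$trid" -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"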
00:07:40.587 [2024-11-30 15:44:48.450187] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.587 [2024-11-30 15:44:48.470973] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.587 [2024-11-30 15:44:48.523447] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:40.587 [2024-11-30 15:44:48.539770] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:40.845 INFO: Running with entropic power schedule (0xFF, 100). 00:07:40.845 INFO: Seed: 989182317 00:07:40.845 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:40.845 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:40.846 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:40.846 INFO: A corpus is not provided, starting from an empty corpus 00:07:40.846 #2 INITED exec/s: 0 rss: 64Mb 00:07:40.846 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:40.846 This may also happen if the target rejected all inputs we tried so far 00:07:40.846 [2024-11-30 15:44:48.584491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4e4e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.846 [2024-11-30 15:44:48.584528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.104 NEW_FUNC[1/717]: 0x46d938 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:41.104 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:41.104 #11 NEW cov: 12276 ft: 12277 corp: 2/14b lim: 40 exec/s: 0 rss: 72Mb L: 13/13 MS: 4 ChangeByte-ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:07:41.104 [2024-11-30 15:44:48.934466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4e4e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.104 [2024-11-30 15:44:48.934506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.104 #12 NEW cov: 12399 ft: 12838 corp: 3/27b lim: 40 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 ShuffleBytes- 00:07:41.104 [2024-11-30 15:44:49.024416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4e4e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.105 [2024-11-30 15:44:49.024446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.105 #13 NEW cov: 12405 ft: 13137 corp: 4/35b lim: 40 exec/s: 0 rss: 72Mb L: 8/13 MS: 1 EraseBytes- 00:07:41.363 [2024-11-30 15:44:49.074403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4e4e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.363 [2024-11-30 15:44:49.074434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.363 #14 NEW cov: 12490 ft: 13444 corp: 5/45b lim: 40 exec/s: 0 rss: 72Mb L: 10/13 MS: 1 EraseBytes- 00:07:41.363 [2024-11-30 15:44:49.124649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250aaaaa cdw11:aaaaaaaa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.363 [2024-11-30 15:44:49.124679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.363 [2024-11-30 15:44:49.124713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.363 [2024-11-30 15:44:49.124729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.363 [2024-11-30 15:44:49.124763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aa4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.363 [2024-11-30 15:44:49.124779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.363 [2024-11-30 15:44:49.124808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4e4e4e4e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.363 [2024-11-30 15:44:49.124823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.364 #15 NEW cov: 12490 ft: 14523 corp: 6/77b lim: 40 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:41.364 [2024-11-30 15:44:49.214535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4e27 cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.364 [2024-11-30 15:44:49.214566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.364 #16 NEW cov: 12490 ft: 14576 corp: 7/85b lim: 40 exec/s: 0 rss: 72Mb L: 8/32 MS: 1 ChangeByte- 00:07:41.364 [2024-11-30 15:44:49.304530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4e4e cdw11:4e4e4e40 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.364 [2024-11-30 15:44:49.304559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.622 #17 NEW cov: 12490 ft: 14625 corp: 8/98b lim: 40 exec/s: 0 rss: 72Mb L: 13/32 MS: 1 ChangeByte- 00:07:41.623 [2024-11-30 15:44:49.354500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4e4e cdw11:b2b1b1a8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.623 [2024-11-30 15:44:49.354529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.623 #18 NEW cov: 12490 ft: 14661 corp: 9/106b lim: 40 exec/s: 0 rss: 72Mb L: 8/32 MS: 1 ChangeBinInt- 00:07:41.623 [2024-11-30 15:44:49.404551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.623 [2024-11-30 15:44:49.404581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.623 #22 NEW cov: 12490 ft: 14707 corp: 10/118b lim: 40 exec/s: 0 rss: 72Mb L: 12/32 MS: 4 CrossOver-EraseBytes-ShuffleBytes-CMP- DE: "\001\000\000\000\000\000\000\001"- 00:07:41.623 [2024-11-30 15:44:49.464744] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:254e4e4e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.623 [2024-11-30 15:44:49.464773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.623 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:41.623 #23 NEW cov: 12513 ft: 14813 corp: 11/126b lim: 40 exec/s: 0 rss: 73Mb L: 8/32 MS: 1 CrossOver- 00:07:41.623 [2024-11-30 15:44:49.554684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4a4e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.623 [2024-11-30 15:44:49.554714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.881 #24 NEW cov: 12513 ft: 14825 corp: 12/134b lim: 40 exec/s: 24 rss: 73Mb L: 8/32 MS: 1 ChangeBit- 00:07:41.881 [2024-11-30 15:44:49.614676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:fffffffe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.881 [2024-11-30 15:44:49.614708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.881 #25 NEW cov: 12513 ft: 14856 corp: 13/146b lim: 40 exec/s: 25 rss: 73Mb L: 12/32 MS: 1 ChangeBinInt- 00:07:41.881 [2024-11-30 15:44:49.704922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250aaaaa cdw11:aaaa31aa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.881 [2024-11-30 15:44:49.704953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.881 [2024-11-30 15:44:49.704988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.881 [2024-11-30 15:44:49.705004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.881 [2024-11-30 15:44:49.705035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aa4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.881 [2024-11-30 15:44:49.705050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.881 [2024-11-30 15:44:49.705079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4e4e4e4e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.881 [2024-11-30 15:44:49.705094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.881 #26 NEW cov: 12513 ft: 14906 corp: 14/178b lim: 40 exec/s: 26 rss: 73Mb L: 32/32 MS: 1 ChangeByte- 00:07:41.881 [2024-11-30 15:44:49.794963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a254e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.881 [2024-11-30 15:44:49.794994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.881 [2024-11-30 15:44:49.795028] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4e4eaaaa cdw11:aaaaaaaa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.881 [2024-11-30 15:44:49.795044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.881 [2024-11-30 15:44:49.795074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aa4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.881 [2024-11-30 15:44:49.795089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.881 [2024-11-30 15:44:49.795118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:4e4e4e4e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.881 [2024-11-30 15:44:49.795134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.882 #27 NEW cov: 12513 ft: 14938 corp: 15/210b lim: 40 exec/s: 27 rss: 73Mb L: 32/32 MS: 1 CrossOver- 00:07:42.139 [2024-11-30 15:44:49.854761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a0004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.139 [2024-11-30 15:44:49.854791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.139 #28 NEW cov: 12513 ft: 14942 corp: 16/223b lim: 40 exec/s: 28 rss: 73Mb L: 13/32 MS: 1 CMP- DE: "\000\004\000\000\000\000\000\000"- 00:07:42.139 [2024-11-30 15:44:49.944773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4e4e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.139 [2024-11-30 15:44:49.944803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.139 #29 NEW cov: 12513 ft: 14967 corp: 17/231b lim: 40 exec/s: 29 rss: 73Mb L: 8/32 MS: 1 ShuffleBytes- 00:07:42.140 [2024-11-30 15:44:49.994777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4e4e cdw11:4e4e0800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.140 [2024-11-30 15:44:49.994812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.140 #30 NEW cov: 12513 ft: 14979 corp: 18/239b lim: 40 exec/s: 30 rss: 73Mb L: 8/32 MS: 1 ChangeBinInt- 00:07:42.140 [2024-11-30 15:44:50.054861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:255b4e4e cdw11:4e4e0800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.140 [2024-11-30 15:44:50.054900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.399 #31 NEW cov: 12513 ft: 14996 corp: 19/247b lim: 40 exec/s: 31 rss: 73Mb L: 8/32 MS: 1 ChangeByte- 00:07:42.399 [2024-11-30 15:44:50.145425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:ff7ffffe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.399 [2024-11-30 15:44:50.145459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.399 #37 NEW cov: 12513 ft: 15066 corp: 
20/259b lim: 40 exec/s: 37 rss: 73Mb L: 12/32 MS: 1 ChangeBit- 00:07:42.399 [2024-11-30 15:44:50.205424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.399 [2024-11-30 15:44:50.205451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.399 #38 NEW cov: 12513 ft: 15173 corp: 21/271b lim: 40 exec/s: 38 rss: 73Mb L: 12/32 MS: 1 ChangeByte- 00:07:42.399 [2024-11-30 15:44:50.245430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4e4e cdw11:4e4e080a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.399 [2024-11-30 15:44:50.245456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.399 #39 NEW cov: 12513 ft: 15181 corp: 22/279b lim: 40 exec/s: 39 rss: 73Mb L: 8/32 MS: 1 CrossOver- 00:07:42.399 [2024-11-30 15:44:50.285454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.399 [2024-11-30 15:44:50.285480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.399 #40 NEW cov: 12513 ft: 15251 corp: 23/291b lim: 40 exec/s: 40 rss: 73Mb L: 12/32 MS: 1 CopyPart- 00:07:42.399 [2024-11-30 15:44:50.325481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4e4e4a4e cdw11:250a4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.399 [2024-11-30 15:44:50.325507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.658 #41 NEW cov: 12513 ft: 15333 corp: 24/299b lim: 40 exec/s: 41 rss: 73Mb L: 8/32 MS: 1 ShuffleBytes- 00:07:42.658 [2024-11-30 15:44:50.385514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:251a4e4e cdw11:b2b1b1a8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.658 [2024-11-30 15:44:50.385540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.658 #42 NEW cov: 12513 ft: 15370 corp: 25/307b lim: 40 exec/s: 42 rss: 73Mb L: 8/32 MS: 1 ChangeBit- 00:07:42.658 [2024-11-30 15:44:50.445536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a4e0100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.658 [2024-11-30 15:44:50.445561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.658 #45 NEW cov: 12513 ft: 15383 corp: 26/319b lim: 40 exec/s: 45 rss: 73Mb L: 12/32 MS: 3 EraseBytes-CrossOver-PersAutoDict- DE: "\001\000\000\000\000\000\000\001"- 00:07:42.658 [2024-11-30 15:44:50.485858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4a4e cdw11:4e251a4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.658 [2024-11-30 15:44:50.485887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.658 [2024-11-30 15:44:50.485944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4eb2b1b1 
cdw11:a84e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.658 [2024-11-30 15:44:50.485958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.658 #46 NEW cov: 12513 ft: 15652 corp: 27/335b lim: 40 exec/s: 46 rss: 73Mb L: 16/32 MS: 1 CrossOver- 00:07:42.658 [2024-11-30 15:44:50.525695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:254e4e0a cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.658 [2024-11-30 15:44:50.525720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.658 #47 NEW cov: 12513 ft: 15654 corp: 28/345b lim: 40 exec/s: 47 rss: 73Mb L: 10/32 MS: 1 CopyPart- 00:07:42.658 [2024-11-30 15:44:50.565858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:250a4e4e cdw11:4e4e4e4e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.659 [2024-11-30 15:44:50.565884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.659 [2024-11-30 15:44:50.565944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4e010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.659 [2024-11-30 15:44:50.565958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.659 #48 NEW cov: 12513 ft: 15664 corp: 29/366b lim: 40 exec/s: 24 rss: 73Mb L: 21/32 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\001"- 00:07:42.659 #48 DONE cov: 12513 ft: 15664 corp: 29/366b lim: 40 exec/s: 24 rss: 73Mb 00:07:42.659 ###### Recommended dictionary. ###### 00:07:42.659 "\001\000\000\000\000\000\000\001" # Uses: 2 00:07:42.659 "\000\004\000\000\000\000\000\000" # Uses: 0 00:07:42.659 ###### End of recommended dictionary. 
###### 00:07:42.659 Done 48 runs in 2 second(s) 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:42.918 15:44:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:07:42.918 [2024-11-30 15:44:50.738605] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:42.918 [2024-11-30 15:44:50.738679] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1715912 ] 00:07:43.177 [2024-11-30 15:44:51.063408] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
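Each run above ends the same way: nvme_qpair.c prints every fuzzed admin command (opcode, qid/cid, cdw10/cdw11, SGL descriptor) and the controller's completion, which here is always INVALID OPCODE (00/01) since the fuzzed opcodes are unimplemented; libFuzzer then reports per-input coverage lines ("#N NEW cov: ... ft: ... corp: ... exec/s: ... MS: <mutations>") and closes with a "Recommended dictionary" block listing byte sequences that repeatedly produced new coverage, with a use count per entry. Those entries can seed future runs. A small sketch (not part of run.sh; file names assumed) for harvesting them from a captured run log:

log=/tmp/llvm_nvmf_11.log    # assumed capture of one run's output
dict=/tmp/llvm_nvmf_11.dict

# Keep only the quoted entries between the dictionary markers.
sed -n '/Recommended dictionary/,/End of recommended dictionary/p' "$log" \
  | grep -o '"[^"]*"' > "$dict"

# Caveat: libFuzzer prints these entries with octal escapes (\000...), while
# its -dict= parser accepts \" \\ and \xNN hex escapes, so the values may
# need converting (\000 -> \x00) before reuse. Whether llvm_nvme_fuzz
# forwards extra libFuzzer flags such as -dict= is not shown in this log.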
00:07:43.177 [2024-11-30 15:44:51.115708] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.177 [2024-11-30 15:44:51.134899] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.435 [2024-11-30 15:44:51.187784] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:43.435 [2024-11-30 15:44:51.204110] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:43.435 INFO: Running with entropic power schedule (0xFF, 100). 00:07:43.435 INFO: Seed: 3653183311 00:07:43.435 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:43.435 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:43.435 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:43.435 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.435 #2 INITED exec/s: 0 rss: 65Mb 00:07:43.435 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:43.435 This may also happen if the target rejected all inputs we tried so far 00:07:43.436 [2024-11-30 15:44:51.275224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a5d3342 cdw11:abbfc694 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.436 [2024-11-30 15:44:51.275263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.695 NEW_FUNC[1/717]: 0x46f6a8 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:43.695 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:43.695 #3 NEW cov: 12284 ft: 12269 corp: 2/10b lim: 40 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CMP- DE: "]3B\253\277\306\224\000"- 00:07:43.955 [2024-11-30 15:44:51.664249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0a5d33 cdw11:42abbfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.955 [2024-11-30 15:44:51.664300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.955 #4 NEW cov: 12397 ft: 13033 corp: 3/20b lim: 40 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 CrossOver- 00:07:43.955 [2024-11-30 15:44:51.714376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a5d3342 cdw11:5d3342ab SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.955 [2024-11-30 15:44:51.714404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.955 [2024-11-30 15:44:51.714539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:bfc69400 cdw11:abbfc694 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.955 [2024-11-30 15:44:51.714554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.955 #5 NEW cov: 12403 ft: 13921 corp: 4/37b lim: 40 exec/s: 0 rss: 72Mb L: 17/17 MS: 1 PersAutoDict- DE: "]3B\253\277\306\224\000"- 00:07:43.955 [2024-11-30 15:44:51.784180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a7a5d33 cdw11:42abbfc6 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:43.955 [2024-11-30 15:44:51.784210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.955 #6 NEW cov: 12488 ft: 14247 corp: 5/47b lim: 40 exec/s: 0 rss: 72Mb L: 10/17 MS: 1 InsertByte- 00:07:43.955 [2024-11-30 15:44:51.834221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a7a5d33 cdw11:42c6bfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.955 [2024-11-30 15:44:51.834247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.955 #7 NEW cov: 12488 ft: 14296 corp: 6/57b lim: 40 exec/s: 0 rss: 72Mb L: 10/17 MS: 1 CrossOver- 00:07:43.955 [2024-11-30 15:44:51.904499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a5d3342 cdw11:5d3348ab SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.955 [2024-11-30 15:44:51.904526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.955 [2024-11-30 15:44:51.904668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:bfc69400 cdw11:abbfc694 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.955 [2024-11-30 15:44:51.904686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.213 #8 NEW cov: 12488 ft: 14464 corp: 7/74b lim: 40 exec/s: 0 rss: 72Mb L: 17/17 MS: 1 ChangeBinInt- 00:07:44.213 [2024-11-30 15:44:51.974285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0a5d33 cdw11:425d3342 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.213 [2024-11-30 15:44:51.974312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.213 #9 NEW cov: 12488 ft: 14564 corp: 8/87b lim: 40 exec/s: 0 rss: 73Mb L: 13/17 MS: 1 CrossOver- 00:07:44.213 [2024-11-30 15:44:52.024267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:085d3342 cdw11:abbfc694 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.213 [2024-11-30 15:44:52.024293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.213 #10 NEW cov: 12488 ft: 14609 corp: 9/96b lim: 40 exec/s: 0 rss: 73Mb L: 9/17 MS: 1 ChangeBit- 00:07:44.213 [2024-11-30 15:44:52.074836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0a7777 cdw11:77777777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.213 [2024-11-30 15:44:52.074862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.213 [2024-11-30 15:44:52.074987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77777777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.213 [2024-11-30 15:44:52.075002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.213 [2024-11-30 15:44:52.075127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77775d33 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.213 [2024-11-30 
15:44:52.075145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.213 #11 NEW cov: 12488 ft: 14875 corp: 10/126b lim: 40 exec/s: 0 rss: 73Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:44.213 [2024-11-30 15:44:52.144681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a7a5d33 cdw11:42abbfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.213 [2024-11-30 15:44:52.144710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.213 [2024-11-30 15:44:52.144845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:7a5d3342 cdw11:abbfc694 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.213 [2024-11-30 15:44:52.144866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.213 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:44.213 #12 NEW cov: 12511 ft: 14931 corp: 11/145b lim: 40 exec/s: 0 rss: 73Mb L: 19/30 MS: 1 CopyPart- 00:07:44.472 [2024-11-30 15:44:52.204769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a5d3342 cdw11:5d3348ab SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.472 [2024-11-30 15:44:52.204798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.472 [2024-11-30 15:44:52.204933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:bfc69400 cdw11:abbfc694 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.472 [2024-11-30 15:44:52.204951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.472 #13 NEW cov: 12511 ft: 15010 corp: 12/163b lim: 40 exec/s: 13 rss: 73Mb L: 18/30 MS: 1 InsertByte- 00:07:44.472 [2024-11-30 15:44:52.274810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a7a5d33 cdw11:42abbfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.472 [2024-11-30 15:44:52.274838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.472 [2024-11-30 15:44:52.274969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94005d33 cdw11:42abbfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.472 [2024-11-30 15:44:52.274987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.472 #14 NEW cov: 12511 ft: 15022 corp: 13/181b lim: 40 exec/s: 14 rss: 73Mb L: 18/30 MS: 1 PersAutoDict- DE: "]3B\253\277\306\224\000"- 00:07:44.472 [2024-11-30 15:44:52.325374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0a7777 cdw11:77777748 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.472 [2024-11-30 15:44:52.325400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.472 [2024-11-30 15:44:52.325517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:abbfc694 cdw11:00abbfc6 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:44.472 [2024-11-30 15:44:52.325532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.472 [2024-11-30 15:44:52.325661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77777777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.472 [2024-11-30 15:44:52.325677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.472 [2024-11-30 15:44:52.325796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:77777777 cdw11:7777775d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.472 [2024-11-30 15:44:52.325812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.472 #15 NEW cov: 12511 ft: 15362 corp: 14/220b lim: 40 exec/s: 15 rss: 73Mb L: 39/39 MS: 1 CrossOver- 00:07:44.472 [2024-11-30 15:44:52.394612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:530b0a7a cdw11:5d3342c6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.472 [2024-11-30 15:44:52.394639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.472 #20 NEW cov: 12511 ft: 15389 corp: 15/233b lim: 40 exec/s: 20 rss: 73Mb L: 13/39 MS: 5 CrossOver-EraseBytes-InsertByte-ChangeBinInt-CrossOver- 00:07:44.731 [2024-11-30 15:44:52.465096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.731 [2024-11-30 15:44:52.465127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.731 [2024-11-30 15:44:52.465260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.731 [2024-11-30 15:44:52.465276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.731 [2024-11-30 15:44:52.465390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:0a5d3342 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.731 [2024-11-30 15:44:52.465406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.731 [2024-11-30 15:44:52.465527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:5d3342ab cdw11:bfc69400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.731 [2024-11-30 15:44:52.465543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.731 #21 NEW cov: 12511 ft: 15507 corp: 16/270b lim: 40 exec/s: 21 rss: 73Mb L: 37/39 MS: 1 InsertRepeatedBytes- 00:07:44.731 [2024-11-30 15:44:52.514611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:9a9a9a9a cdw11:9a9a9a0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.731 [2024-11-30 15:44:52.514639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.731 
#22 NEW cov: 12511 ft: 15521 corp: 17/278b lim: 40 exec/s: 22 rss: 73Mb L: 8/39 MS: 1 InsertRepeatedBytes- 00:07:44.731 [2024-11-30 15:44:52.564675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0a5d33 cdw11:42abbfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.731 [2024-11-30 15:44:52.564702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.731 #23 NEW cov: 12511 ft: 15572 corp: 18/289b lim: 40 exec/s: 23 rss: 73Mb L: 11/39 MS: 1 InsertByte- 00:07:44.731 [2024-11-30 15:44:52.614790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0a5d33 cdw11:42abbfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.731 [2024-11-30 15:44:52.614817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.731 #24 NEW cov: 12511 ft: 15582 corp: 19/299b lim: 40 exec/s: 24 rss: 73Mb L: 10/39 MS: 1 ChangeBit- 00:07:44.731 [2024-11-30 15:44:52.665662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0a7777 cdw11:77777748 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.731 [2024-11-30 15:44:52.665690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.731 [2024-11-30 15:44:52.665813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:abbfc694 cdw11:00abbfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.731 [2024-11-30 15:44:52.665830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.731 [2024-11-30 15:44:52.665960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77777777 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.731 [2024-11-30 15:44:52.665977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.731 [2024-11-30 15:44:52.666101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:77777777 cdw11:7777775d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.731 [2024-11-30 15:44:52.666118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.991 #25 NEW cov: 12511 ft: 15597 corp: 20/338b lim: 40 exec/s: 25 rss: 73Mb L: 39/39 MS: 1 ChangeByte- 00:07:44.991 [2024-11-30 15:44:52.734888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a5d7342 cdw11:abbfc694 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.991 [2024-11-30 15:44:52.734916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.991 #26 NEW cov: 12511 ft: 15622 corp: 21/347b lim: 40 exec/s: 26 rss: 73Mb L: 9/39 MS: 1 ChangeBit- 00:07:44.991 [2024-11-30 15:44:52.785212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a7a5d33 cdw11:42abbfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.991 [2024-11-30 15:44:52.785239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.991 [2024-11-30 
15:44:52.785358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:7a000942 cdw11:abbfc694 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.991 [2024-11-30 15:44:52.785375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.991 #27 NEW cov: 12511 ft: 15633 corp: 22/366b lim: 40 exec/s: 27 rss: 73Mb L: 19/39 MS: 1 CMP- DE: "\000\011"- 00:07:44.991 [2024-11-30 15:44:52.854990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:9a9a9a9a cdw11:9a9a9a9a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.991 [2024-11-30 15:44:52.855019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.991 #28 NEW cov: 12511 ft: 15679 corp: 23/374b lim: 40 exec/s: 28 rss: 73Mb L: 8/39 MS: 1 CopyPart- 00:07:44.991 [2024-11-30 15:44:52.924954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0a0033 cdw11:425d3342 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.991 [2024-11-30 15:44:52.924982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.250 #29 NEW cov: 12511 ft: 15694 corp: 24/387b lim: 40 exec/s: 29 rss: 73Mb L: 13/39 MS: 1 CrossOver- 00:07:45.250 [2024-11-30 15:44:52.995008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a5d3342 cdw11:ab33c694 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.250 [2024-11-30 15:44:52.995036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.250 #30 NEW cov: 12511 ft: 15710 corp: 25/396b lim: 40 exec/s: 30 rss: 74Mb L: 9/39 MS: 1 CrossOver- 00:07:45.250 [2024-11-30 15:44:53.045887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0a7777 cdw11:77777748 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.250 [2024-11-30 15:44:53.045914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.250 [2024-11-30 15:44:53.046038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:abbfc694 cdw11:00abbfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.250 [2024-11-30 15:44:53.046055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.250 [2024-11-30 15:44:53.046179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:77777700 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.250 [2024-11-30 15:44:53.046195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.250 [2024-11-30 15:44:53.046326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00002777 cdw11:7777775d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.250 [2024-11-30 15:44:53.046341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.250 #31 NEW cov: 12511 ft: 15719 corp: 26/435b lim: 40 exec/s: 31 rss: 74Mb L: 39/39 MS: 1 ChangeBinInt- 00:07:45.250 [2024-11-30 
15:44:53.115040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0a5d37 cdw11:42abbfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.250 [2024-11-30 15:44:53.115070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.250 #32 NEW cov: 12511 ft: 15735 corp: 27/445b lim: 40 exec/s: 32 rss: 74Mb L: 10/39 MS: 1 ChangeBit- 00:07:45.250 [2024-11-30 15:44:53.165099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a7a7d33 cdw11:42c6bfc6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.250 [2024-11-30 15:44:53.165125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.250 #33 NEW cov: 12511 ft: 15744 corp: 28/455b lim: 40 exec/s: 33 rss: 74Mb L: 10/39 MS: 1 ChangeBit- 00:07:45.510 [2024-11-30 15:44:53.235114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a5de242 cdw11:abbfc694 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.510 [2024-11-30 15:44:53.235140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.510 #34 NEW cov: 12511 ft: 15747 corp: 29/464b lim: 40 exec/s: 17 rss: 74Mb L: 9/39 MS: 1 ChangeByte- 00:07:45.510 #34 DONE cov: 12511 ft: 15747 corp: 29/464b lim: 40 exec/s: 17 rss: 74Mb 00:07:45.510 ###### Recommended dictionary. ###### 00:07:45.510 "]3B\253\277\306\224\000" # Uses: 2 00:07:45.510 "\000\011" # Uses: 0 00:07:45.510 ###### End of recommended dictionary. ###### 00:07:45.510 Done 34 runs in 2 second(s) 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:45.510 15:44:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:07:45.510 [2024-11-30 15:44:53.420259] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:45.510 [2024-11-30 15:44:53.420324] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1716451 ] 00:07:45.769 [2024-11-30 15:44:53.731643] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:46.028 [2024-11-30 15:44:53.778065] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.028 [2024-11-30 15:44:53.800562] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.028 [2024-11-30 15:44:53.852702] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:46.028 [2024-11-30 15:44:53.869012] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:46.028 INFO: Running with entropic power schedule (0xFF, 100). 00:07:46.028 INFO: Seed: 2025208015 00:07:46.028 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:46.028 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:46.028 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:46.028 INFO: A corpus is not provided, starting from an empty corpus 00:07:46.028 #2 INITED exec/s: 0 rss: 64Mb 00:07:46.028 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:46.028 This may also happen if the target rejected all inputs we tried so far 00:07:46.028 [2024-11-30 15:44:53.913699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.028 [2024-11-30 15:44:53.913733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.597 NEW_FUNC[1/716]: 0x471278 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:46.597 NEW_FUNC[2/716]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:46.597 #11 NEW cov: 12272 ft: 12258 corp: 2/13b lim: 40 exec/s: 0 rss: 72Mb L: 12/12 MS: 4 InsertByte-ChangeByte-EraseBytes-InsertRepeatedBytes- 00:07:46.597 [2024-11-30 15:44:54.285052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffff11 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.597 [2024-11-30 15:44:54.285118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.597 #12 NEW cov: 12385 ft: 13068 corp: 3/25b lim: 40 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 CMP- DE: "\021\000\000\000\000\000\000\000"- 00:07:46.597 [2024-11-30 15:44:54.355021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.597 [2024-11-30 15:44:54.355052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.597 #13 NEW cov: 12391 ft: 13313 corp: 4/37b lim: 40 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 CopyPart- 00:07:46.597 [2024-11-30 15:44:54.394996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffffff cdw11:ffffff2d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.597 [2024-11-30 15:44:54.395026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.597 #14 NEW cov: 12476 ft: 13621 corp: 5/50b lim: 40 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 InsertByte- 00:07:46.597 [2024-11-30 15:44:54.454972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.597 [2024-11-30 15:44:54.455001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.597 #15 NEW cov: 12476 ft: 13683 corp: 6/62b lim: 40 exec/s: 0 rss: 72Mb L: 12/13 MS: 1 CopyPart- 00:07:46.597 [2024-11-30 15:44:54.495277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25110000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.597 [2024-11-30 15:44:54.495306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.597 [2024-11-30 15:44:54.495428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00ffff11 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.597 [2024-11-30 15:44:54.495446] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.597 #16 NEW cov: 12476 ft: 14007 corp: 7/82b lim: 40 exec/s: 0 rss: 72Mb L: 20/20 MS: 1 PersAutoDict- DE: "\021\000\000\000\000\000\000\000"- 00:07:46.597 [2024-11-30 15:44:54.555098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:32ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.597 [2024-11-30 15:44:54.555126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.856 #17 NEW cov: 12476 ft: 14077 corp: 8/94b lim: 40 exec/s: 0 rss: 72Mb L: 12/20 MS: 1 ChangeByte- 00:07:46.856 [2024-11-30 15:44:54.615000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:32ffffff cdw11:ff5bffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.856 [2024-11-30 15:44:54.615028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.856 #18 NEW cov: 12476 ft: 14101 corp: 9/106b lim: 40 exec/s: 0 rss: 72Mb L: 12/20 MS: 1 ChangeByte- 00:07:46.856 [2024-11-30 15:44:54.675059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffff11 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.856 [2024-11-30 15:44:54.675085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.856 #19 NEW cov: 12476 ft: 14219 corp: 10/118b lim: 40 exec/s: 0 rss: 72Mb L: 12/20 MS: 1 CrossOver- 00:07:46.856 [2024-11-30 15:44:54.715214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffff11 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.856 [2024-11-30 15:44:54.715243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.856 [2024-11-30 15:44:54.715369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:11000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.856 [2024-11-30 15:44:54.715385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.856 #20 NEW cov: 12476 ft: 14290 corp: 11/138b lim: 40 exec/s: 0 rss: 72Mb L: 20/20 MS: 1 PersAutoDict- DE: "\021\000\000\000\000\000\000\000"- 00:07:46.856 [2024-11-30 15:44:54.775475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffff11 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.856 [2024-11-30 15:44:54.775501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.856 [2024-11-30 15:44:54.775628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:80000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.856 [2024-11-30 15:44:54.775654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.856 [2024-11-30 15:44:54.775776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff 
cdw11:11000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.856 [2024-11-30 15:44:54.775792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.856 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:46.856 #21 NEW cov: 12493 ft: 14557 corp: 12/166b lim: 40 exec/s: 0 rss: 73Mb L: 28/28 MS: 1 CMP- DE: "\200\000\000\000\000\000\000\000"- 00:07:47.115 [2024-11-30 15:44:54.835082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2525ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.116 [2024-11-30 15:44:54.835109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.116 #22 NEW cov: 12493 ft: 14624 corp: 13/178b lim: 40 exec/s: 0 rss: 73Mb L: 12/28 MS: 1 CrossOver- 00:07:47.116 [2024-11-30 15:44:54.875750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:251100ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.116 [2024-11-30 15:44:54.875777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.116 [2024-11-30 15:44:54.875902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.116 [2024-11-30 15:44:54.875918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.116 [2024-11-30 15:44:54.876043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffff11 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.116 [2024-11-30 15:44:54.876061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.116 [2024-11-30 15:44:54.876192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.116 [2024-11-30 15:44:54.876208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.116 #23 NEW cov: 12493 ft: 15128 corp: 14/210b lim: 40 exec/s: 23 rss: 73Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:47.116 [2024-11-30 15:44:54.935173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ff0812 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.116 [2024-11-30 15:44:54.935200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.116 #24 NEW cov: 12493 ft: 15132 corp: 15/222b lim: 40 exec/s: 24 rss: 73Mb L: 12/32 MS: 1 ChangeBinInt- 00:07:47.116 [2024-11-30 15:44:54.975207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffffff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.116 [2024-11-30 15:44:54.975232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.116 #25 NEW cov: 12493 ft: 15161 corp: 16/235b lim: 40 exec/s: 25 rss: 
73Mb L: 13/32 MS: 1 InsertByte- 00:07:47.116 [2024-11-30 15:44:55.015282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25110000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.116 [2024-11-30 15:44:55.015308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.116 #26 NEW cov: 12493 ft: 15169 corp: 17/248b lim: 40 exec/s: 26 rss: 73Mb L: 13/32 MS: 1 PersAutoDict- DE: "\021\000\000\000\000\000\000\000"- 00:07:47.116 [2024-11-30 15:44:55.075326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:11000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.116 [2024-11-30 15:44:55.075354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.375 #27 NEW cov: 12493 ft: 15183 corp: 18/260b lim: 40 exec/s: 27 rss: 73Mb L: 12/32 MS: 1 PersAutoDict- DE: "\021\000\000\000\000\000\000\000"- 00:07:47.375 [2024-11-30 15:44:55.115261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:32ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.375 [2024-11-30 15:44:55.115287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.376 #28 NEW cov: 12493 ft: 15241 corp: 19/272b lim: 40 exec/s: 28 rss: 73Mb L: 12/32 MS: 1 ShuffleBytes- 00:07:47.376 [2024-11-30 15:44:55.155267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0cffff11 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.155294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.376 #29 NEW cov: 12493 ft: 15274 corp: 20/284b lim: 40 exec/s: 29 rss: 73Mb L: 12/32 MS: 1 ChangeBinInt- 00:07:47.376 [2024-11-30 15:44:55.195798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.195825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.376 [2024-11-30 15:44:55.195951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.195968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.376 [2024-11-30 15:44:55.196092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.196109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.376 #32 NEW cov: 12493 ft: 15305 corp: 21/314b lim: 40 exec/s: 32 rss: 73Mb L: 30/32 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:47.376 [2024-11-30 15:44:55.236056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25535353 cdw11:53535353 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.236083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.376 [2024-11-30 15:44:55.236213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:53535353 cdw11:53110000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.236229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.376 [2024-11-30 15:44:55.236351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffff11 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.236369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.376 [2024-11-30 15:44:55.236493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.236509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.376 #33 NEW cov: 12493 ft: 15371 corp: 22/346b lim: 40 exec/s: 33 rss: 73Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:47.376 [2024-11-30 15:44:55.275781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:c9c9c9c9 cdw11:c9c9c9c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.275807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.376 [2024-11-30 15:44:55.275939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:c9c9c9c9 cdw11:c9c9c9c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.275955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.376 [2024-11-30 15:44:55.276072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:c9c925ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.276088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.376 #34 NEW cov: 12493 ft: 15385 corp: 23/376b lim: 40 exec/s: 34 rss: 73Mb L: 30/32 MS: 1 InsertRepeatedBytes- 00:07:47.376 [2024-11-30 15:44:55.315727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:c9c999c9 cdw11:c9c9c9c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.315753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.376 [2024-11-30 15:44:55.315868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:c9c9c9c9 cdw11:c9c9c9c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.315887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.376 [2024-11-30 15:44:55.316017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:c9c925ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.376 [2024-11-30 15:44:55.316033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.635 #35 NEW cov: 12493 ft: 15401 corp: 24/406b lim: 40 exec/s: 35 rss: 73Mb L: 30/32 MS: 1 ChangeByte- 00:07:47.635 [2024-11-30 15:44:55.376032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2525ffff cdw11:ff5c5c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.635 [2024-11-30 15:44:55.376058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.635 [2024-11-30 15:44:55.376183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:5c5c5c5c cdw11:5c5c5c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.635 [2024-11-30 15:44:55.376199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.635 [2024-11-30 15:44:55.376332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:5c5c5c5c cdw11:5c5c5c5c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.635 [2024-11-30 15:44:55.376354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.635 [2024-11-30 15:44:55.376516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:5c5c5c5c cdw11:5c5cffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.635 [2024-11-30 15:44:55.376539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.635 #36 NEW cov: 12493 ft: 15478 corp: 25/443b lim: 40 exec/s: 36 rss: 73Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:07:47.635 [2024-11-30 15:44:55.435437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.635 [2024-11-30 15:44:55.435467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.635 #37 NEW cov: 12493 ft: 15490 corp: 26/452b lim: 40 exec/s: 37 rss: 73Mb L: 9/37 MS: 1 EraseBytes- 00:07:47.635 [2024-11-30 15:44:55.496094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.635 [2024-11-30 15:44:55.496124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.635 [2024-11-30 15:44:55.496244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.635 [2024-11-30 15:44:55.496262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.635 [2024-11-30 15:44:55.496383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:bebe2511 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.635 [2024-11-30 15:44:55.496400] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.635 [2024-11-30 15:44:55.496523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:000000ff cdw11:ff110000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.635 [2024-11-30 15:44:55.496538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.635 #38 NEW cov: 12493 ft: 15522 corp: 27/490b lim: 40 exec/s: 38 rss: 73Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:07:47.635 [2024-11-30 15:44:55.535663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25110000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.635 [2024-11-30 15:44:55.535690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.635 [2024-11-30 15:44:55.535823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00ffff11 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.635 [2024-11-30 15:44:55.535840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.636 #39 NEW cov: 12493 ft: 15559 corp: 28/510b lim: 40 exec/s: 39 rss: 73Mb L: 20/38 MS: 1 ShuffleBytes- 00:07:47.636 [2024-11-30 15:44:55.575459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ff1100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.636 [2024-11-30 15:44:55.575487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.636 #40 NEW cov: 12493 ft: 15581 corp: 29/522b lim: 40 exec/s: 40 rss: 73Mb L: 12/38 MS: 1 PersAutoDict- DE: "\021\000\000\000\000\000\000\000"- 00:07:47.896 [2024-11-30 15:44:55.615515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.896 [2024-11-30 15:44:55.615542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.896 #41 NEW cov: 12493 ft: 15612 corp: 30/534b lim: 40 exec/s: 41 rss: 73Mb L: 12/38 MS: 1 ChangeBinInt- 00:07:47.896 [2024-11-30 15:44:55.655604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffffff cdw11:fffffffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.896 [2024-11-30 15:44:55.655631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.896 #42 NEW cov: 12493 ft: 15618 corp: 31/546b lim: 40 exec/s: 42 rss: 73Mb L: 12/38 MS: 1 ChangeBit- 00:07:47.896 [2024-11-30 15:44:55.695660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffffff cdw11:fffffffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.896 [2024-11-30 15:44:55.695689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.896 #43 NEW cov: 12493 ft: 15670 corp: 32/558b lim: 40 exec/s: 43 rss: 73Mb L: 12/38 MS: 1 ChangeBinInt- 00:07:47.896 [2024-11-30 15:44:55.756234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.896 [2024-11-30 15:44:55.756266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.896 [2024-11-30 15:44:55.756402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.896 [2024-11-30 15:44:55.756419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.896 [2024-11-30 15:44:55.756544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.896 [2024-11-30 15:44:55.756560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.896 [2024-11-30 15:44:55.756685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.896 [2024-11-30 15:44:55.756703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.896 #45 NEW cov: 12493 ft: 15679 corp: 33/593b lim: 40 exec/s: 45 rss: 73Mb L: 35/38 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:47.896 [2024-11-30 15:44:55.795867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25ff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.896 [2024-11-30 15:44:55.795897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.896 [2024-11-30 15:44:55.796037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff110000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.896 [2024-11-30 15:44:55.796054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.896 #46 NEW cov: 12500 ft: 15726 corp: 34/615b lim: 40 exec/s: 46 rss: 73Mb L: 22/38 MS: 1 CrossOver- 00:07:47.896 [2024-11-30 15:44:55.835534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:250000ff cdw11:ffff1100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.896 [2024-11-30 15:44:55.835560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.896 #47 NEW cov: 12500 ft: 15735 corp: 35/627b lim: 40 exec/s: 47 rss: 73Mb L: 12/38 MS: 1 ShuffleBytes- 00:07:48.156 [2024-11-30 15:44:55.876394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:25535353 cdw11:53535353 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.156 [2024-11-30 15:44:55.876422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.156 [2024-11-30 15:44:55.876556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:53535300 cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.156 [2024-11-30 15:44:55.876572] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.156 [2024-11-30 15:44:55.876704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:11005353 cdw11:11000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.156 [2024-11-30 15:44:55.876720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.156 [2024-11-30 15:44:55.876850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:ffff1100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:48.156 [2024-11-30 15:44:55.876865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.156 #48 NEW cov: 12500 ft: 15744 corp: 36/666b lim: 40 exec/s: 24 rss: 74Mb L: 39/39 MS: 1 CopyPart- 00:07:48.156 #48 DONE cov: 12500 ft: 15744 corp: 36/666b lim: 40 exec/s: 24 rss: 74Mb 00:07:48.156 ###### Recommended dictionary. ###### 00:07:48.156 "\021\000\000\000\000\000\000\000" # Uses: 5 00:07:48.156 "\200\000\000\000\000\000\000\000" # Uses: 0 00:07:48.156 ###### End of recommended dictionary. ###### 00:07:48.156 Done 48 runs in 2 second(s) 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:48.156 15:44:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:07:48.156 [2024-11-30 15:44:56.069014] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:48.156 [2024-11-30 15:44:56.069076] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1716913 ] 00:07:48.725 [2024-11-30 15:44:56.383384] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:48.725 [2024-11-30 15:44:56.430030] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.725 [2024-11-30 15:44:56.452848] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.725 [2024-11-30 15:44:56.505045] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:48.725 [2024-11-30 15:44:56.521360] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:07:48.725 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.725 INFO: Seed: 380236960 00:07:48.725 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:48.725 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:48.725 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:48.725 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.725 #2 INITED exec/s: 0 rss: 64Mb 00:07:48.725 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:48.725 This may also happen if the target rejected all inputs we tried so far 00:07:48.725 [2024-11-30 15:44:56.570769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.725 [2024-11-30 15:44:56.570798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.725 [2024-11-30 15:44:56.570856] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.725 [2024-11-30 15:44:56.570871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.725 [2024-11-30 15:44:56.570928] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.725 [2024-11-30 15:44:56.570943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.725 [2024-11-30 15:44:56.570997] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.725 [2024-11-30 15:44:56.571012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:48.984 NEW_FUNC[1/719]: 0x472e48 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:07:48.984 NEW_FUNC[2/719]: 0x494398 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:48.984 #5 NEW cov: 12299 ft: 12298 corp: 2/36b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:07:48.984 [2024-11-30 15:44:56.900245] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.984 [2024-11-30 15:44:56.900283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.984 #8 NEW cov: 12412 ft: 13840 corp: 3/43b lim: 35 exec/s: 0 rss: 72Mb L: 7/35 MS: 3 CrossOver-InsertRepeatedBytes-CopyPart- 00:07:49.244 [2024-11-30 15:44:56.950149] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-30 15:44:56.950178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.244 #9 NEW cov: 12418 ft: 14046 corp: 4/50b lim: 35 exec/s: 0 rss: 72Mb L: 7/35 MS: 1 ChangeBinInt- 00:07:49.244 [2024-11-30 15:44:57.010186] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-30 15:44:57.010214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.244 #10 NEW cov: 12503 ft: 14301 corp: 5/58b lim: 35 exec/s: 0 rss: 72Mb L: 8/35 MS: 1 InsertByte- 00:07:49.244 [2024-11-30 15:44:57.070212] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY 
cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-30 15:44:57.070240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.244 #11 NEW cov: 12503 ft: 14400 corp: 6/65b lim: 35 exec/s: 0 rss: 72Mb L: 7/35 MS: 1 CMP- DE: "\000\000"- 00:07:49.244 [2024-11-30 15:44:57.110716] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-30 15:44:57.110742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.244 [2024-11-30 15:44:57.110819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-30 15:44:57.110836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.244 [2024-11-30 15:44:57.110896] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-30 15:44:57.110909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.244 [2024-11-30 15:44:57.110967] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:7 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-30 15:44:57.110984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.244 #12 NEW cov: 12510 ft: 14532 corp: 7/93b lim: 35 exec/s: 0 rss: 72Mb L: 28/35 MS: 1 InsertRepeatedBytes- 00:07:49.244 [2024-11-30 15:44:57.150717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-30 15:44:57.150743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.244 [2024-11-30 15:44:57.150806] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-30 15:44:57.150819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.244 [2024-11-30 15:44:57.150877] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-30 15:44:57.150890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.244 [2024-11-30 15:44:57.150946] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:7 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.244 [2024-11-30 15:44:57.150962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.244 #13 NEW cov: 12510 ft: 14608 corp: 8/121b lim: 35 exec/s: 0 rss: 72Mb L: 28/35 MS: 1 CMP- DE: "\007\000\000\000"- 00:07:49.504 [2024-11-30 15:44:57.210303] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-30 15:44:57.210331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.504 #14 NEW cov: 12510 ft: 14674 corp: 9/131b lim: 35 exec/s: 0 rss: 72Mb L: 10/35 MS: 1 CopyPart- 00:07:49.504 [2024-11-30 15:44:57.250793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-30 15:44:57.250820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.504 [2024-11-30 15:44:57.250879] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-30 15:44:57.250893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.504 [2024-11-30 15:44:57.250950] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-30 15:44:57.250963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.504 [2024-11-30 15:44:57.251022] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-30 15:44:57.251035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.504 #15 NEW cov: 12510 ft: 14722 corp: 10/163b lim: 35 exec/s: 0 rss: 72Mb L: 32/35 MS: 1 PersAutoDict- DE: "\007\000\000\000"- 00:07:49.504 [2024-11-30 15:44:57.310311] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-30 15:44:57.310339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.504 #16 NEW cov: 12510 ft: 14758 corp: 11/170b lim: 35 exec/s: 0 rss: 72Mb L: 7/35 MS: 1 CopyPart- 00:07:49.504 [2024-11-30 15:44:57.350312] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-30 15:44:57.350340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.504 #17 NEW cov: 12510 ft: 14782 corp: 12/177b lim: 35 exec/s: 0 rss: 73Mb L: 7/35 MS: 1 CopyPart- 00:07:49.504 [2024-11-30 15:44:57.411073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-30 15:44:57.411100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.504 [2024-11-30 15:44:57.411175] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-30 15:44:57.411191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.504 [2024-11-30 15:44:57.411250] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-30 15:44:57.411267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.504 [2024-11-30 15:44:57.411325] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.504 [2024-11-30 15:44:57.411341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.504 #18 NEW cov: 12510 ft: 14791 corp: 13/212b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:49.764 [2024-11-30 15:44:57.470937] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.764 [2024-11-30 15:44:57.470965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.764 [2024-11-30 15:44:57.471028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000005d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.764 [2024-11-30 15:44:57.471045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.764 [2024-11-30 15:44:57.471102] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.764 [2024-11-30 15:44:57.471116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.764 [2024-11-30 15:44:57.471177] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.764 [2024-11-30 15:44:57.471191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.764 #19 NEW cov: 12510 ft: 14832 corp: 14/240b lim: 35 exec/s: 0 rss: 73Mb L: 28/35 MS: 1 CrossOver- 00:07:49.764 [2024-11-30 15:44:57.510377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.764 [2024-11-30 15:44:57.510405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.764 #20 NEW cov: 12510 ft: 14854 corp: 15/250b lim: 35 exec/s: 20 rss: 73Mb L: 10/35 MS: 1 ChangeByte- 00:07:49.764 [2024-11-30 15:44:57.570424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.764 [2024-11-30 15:44:57.570452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.764 #21 NEW cov: 12510 ft: 14873 corp: 16/258b lim: 35 exec/s: 21 rss: 73Mb L: 8/35 MS: 1 CopyPart- 00:07:49.764 [2024-11-30 15:44:57.630481] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.764 [2024-11-30 15:44:57.630507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:49.764 #24 NEW cov: 12510 ft: 14920 corp: 17/270b lim: 35 exec/s: 24 rss: 73Mb L: 12/35 MS: 3 ShuffleBytes-CrossOver-CrossOver- 00:07:49.764 [2024-11-30 15:44:57.670471] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.764 [2024-11-30 15:44:57.670499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.764 #25 NEW cov: 12510 ft: 14932 corp: 18/280b lim: 35 exec/s: 25 rss: 73Mb L: 10/35 MS: 1 ChangeByte- 00:07:50.023 [2024-11-30 15:44:57.730547] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.023 [2024-11-30 15:44:57.730576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.023 #26 NEW cov: 12510 ft: 14948 corp: 19/290b lim: 35 exec/s: 26 rss: 73Mb L: 10/35 MS: 1 ShuffleBytes- 00:07:50.024 [2024-11-30 15:44:57.770992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.024 [2024-11-30 15:44:57.771018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.024 [2024-11-30 15:44:57.771081] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.024 [2024-11-30 15:44:57.771095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.024 [2024-11-30 15:44:57.771154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.024 [2024-11-30 15:44:57.771168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.024 [2024-11-30 15:44:57.771227] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000004f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.024 [2024-11-30 15:44:57.771241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.024 #27 NEW cov: 12510 ft: 14970 corp: 20/322b lim: 35 exec/s: 27 rss: 73Mb L: 32/35 MS: 1 ChangeBit- 00:07:50.024 [2024-11-30 15:44:57.830547] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.024 [2024-11-30 15:44:57.830575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.024 #28 NEW cov: 12510 ft: 14980 corp: 21/332b lim: 35 exec/s: 28 rss: 73Mb L: 10/35 MS: 1 ChangeBit- 00:07:50.024 [2024-11-30 15:44:57.870594] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.024 [2024-11-30 15:44:57.870627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.024 #29 NEW cov: 12510 ft: 15006 corp: 22/339b lim: 35 exec/s: 29 rss: 73Mb L: 7/35 MS: 1 PersAutoDict- DE: "\000\000"- 
00:07:50.024 [2024-11-30 15:44:57.930571] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.024 [2024-11-30 15:44:57.930603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.024 #30 NEW cov: 12510 ft: 15023 corp: 23/349b lim: 35 exec/s: 30 rss: 73Mb L: 10/35 MS: 1 PersAutoDict- DE: "\007\000\000\000"- 00:07:50.024 [2024-11-30 15:44:57.970580] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.024 [2024-11-30 15:44:57.970611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.284 #31 NEW cov: 12510 ft: 15067 corp: 24/357b lim: 35 exec/s: 31 rss: 73Mb L: 8/35 MS: 1 InsertByte- 00:07:50.284 #32 NEW cov: 12510 ft: 15079 corp: 25/367b lim: 35 exec/s: 32 rss: 73Mb L: 10/35 MS: 1 ChangeBinInt- 00:07:50.284 [2024-11-30 15:44:58.090685] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.284 [2024-11-30 15:44:58.090719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.284 #33 NEW cov: 12510 ft: 15096 corp: 26/375b lim: 35 exec/s: 33 rss: 74Mb L: 8/35 MS: 1 ChangeBit- 00:07:50.284 [2024-11-30 15:44:58.150704] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.284 [2024-11-30 15:44:58.150729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.284 #34 NEW cov: 12510 ft: 15105 corp: 27/387b lim: 35 exec/s: 34 rss: 74Mb L: 12/35 MS: 1 CrossOver- 00:07:50.284 #35 NEW cov: 12510 ft: 15130 corp: 28/395b lim: 35 exec/s: 35 rss: 74Mb L: 8/35 MS: 1 CrossOver- 00:07:50.543 [2024-11-30 15:44:58.250928] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.543 [2024-11-30 15:44:58.250954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.543 [2024-11-30 15:44:58.251015] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:5 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.543 [2024-11-30 15:44:58.251032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.543 #36 NEW cov: 12510 ft: 15317 corp: 29/412b lim: 35 exec/s: 36 rss: 74Mb L: 17/35 MS: 1 CopyPart- 00:07:50.544 [2024-11-30 15:44:58.310788] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.544 [2024-11-30 15:44:58.310816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.544 [2024-11-30 15:44:58.370784] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.544 [2024-11-30 15:44:58.370810] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.544 #38 NEW cov: 12510 ft: 15330 corp: 30/419b lim: 35 exec/s: 38 rss: 74Mb L: 7/35 MS: 2 CrossOver-ShuffleBytes- 00:07:50.544 [2024-11-30 15:44:58.410805] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.544 [2024-11-30 15:44:58.410832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.544 #42 NEW cov: 12510 ft: 15346 corp: 31/426b lim: 35 exec/s: 42 rss: 74Mb L: 7/35 MS: 4 EraseBytes-ShuffleBytes-ShuffleBytes-InsertByte- 00:07:50.544 [2024-11-30 15:44:58.450855] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.544 [2024-11-30 15:44:58.450883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.544 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:50.544 #43 NEW cov: 12533 ft: 15409 corp: 32/433b lim: 35 exec/s: 43 rss: 74Mb L: 7/35 MS: 1 ChangeBit- 00:07:50.803 [2024-11-30 15:44:58.511077] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.803 [2024-11-30 15:44:58.511104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.804 [2024-11-30 15:44:58.511168] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.804 [2024-11-30 15:44:58.511186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.804 #44 NEW cov: 12533 ft: 15426 corp: 33/451b lim: 35 exec/s: 22 rss: 74Mb L: 18/35 MS: 1 CopyPart- 00:07:50.804 #44 DONE cov: 12533 ft: 15426 corp: 33/451b lim: 35 exec/s: 22 rss: 74Mb 00:07:50.804 ###### Recommended dictionary. ###### 00:07:50.804 "\000\000" # Uses: 1 00:07:50.804 "\007\000\000\000" # Uses: 2 00:07:50.804 ###### End of recommended dictionary. 
###### 00:07:50.804 Done 44 runs in 2 second(s) 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:50.804 15:44:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:07:50.804 [2024-11-30 15:44:58.700768] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:50.804 [2024-11-30 15:44:58.700838] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1717273 ] 00:07:51.063 [2024-11-30 15:44:59.015948] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
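The nvmf/run.sh trace above shows how each fuzzer instance is provisioned: the fuzzer index (15) is zero-padded and appended to 44 to form TCP port 4415, the shared fuzz_json.conf template is rewritten with sed so this instance listens on that port, two known-leak suppressions are written for LeakSanitizer, and llvm_nvme_fuzz is launched against the resulting transport ID. A minimal shell sketch of that per-instance setup, reconstructed from the trace (the variable names, the suppression-file redirections, and the port arithmetic are assumptions inferred from the log, not the literal script):

    fuzzer_type=15
    port=44$(printf %02d "$fuzzer_type")   # 15 -> 4415, 16 -> 4416, ...
    nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    suppress_file=/var/tmp/suppress_nvmf_fuzz
    # rewrite the template config so this instance listens on its own port
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    # suppress known, intentional leaks in the target for LeakSanitizer
    echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
    echo leak:nvmf_ctrlr_create >> "$suppress_file"
    export LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    mkdir -p "$corpus_dir"
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
        -F "$trid" -c "$nvmf_cfg" -t 1 -D "$corpus_dir" -Z "$fuzzer_type"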
00:07:51.323 [2024-11-30 15:44:59.062838] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.323 [2024-11-30 15:44:59.078902] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.323 [2024-11-30 15:44:59.131687] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:51.323 [2024-11-30 15:44:59.147983] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:07:51.323 INFO: Running with entropic power schedule (0xFF, 100). 00:07:51.323 INFO: Seed: 3009243737 00:07:51.323 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:51.323 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:51.323 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:51.323 INFO: A corpus is not provided, starting from an empty corpus 00:07:51.323 #2 INITED exec/s: 0 rss: 64Mb 00:07:51.323 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:51.323 This may also happen if the target rejected all inputs we tried so far 00:07:51.323 [2024-11-30 15:44:59.225910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.323 [2024-11-30 15:44:59.225950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.323 [2024-11-30 15:44:59.226033] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.323 [2024-11-30 15:44:59.226048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.323 [2024-11-30 15:44:59.226126] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.323 [2024-11-30 15:44:59.226141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.323 [2024-11-30 15:44:59.226223] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.323 [2024-11-30 15:44:59.226239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.583 NEW_FUNC[1/716]: 0x474388 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:51.583 NEW_FUNC[2/716]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:51.583 #9 NEW cov: 12236 ft: 12237 corp: 2/33b lim: 35 exec/s: 0 rss: 72Mb L: 32/32 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:51.842 [2024-11-30 15:44:59.554126] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.842 [2024-11-30 15:44:59.554173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.842 #14 NEW cov: 12366 ft: 13554 corp: 3/41b lim: 35 exec/s: 0 rss: 72Mb L: 8/32 MS: 5 ChangeBit-ChangeBit-CMP-ShuffleBytes-CopyPart- DE: "\000\000\000\000"- 00:07:51.842 [2024-11-30 
15:44:59.614230] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.842 [2024-11-30 15:44:59.614260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.842 [2024-11-30 15:44:59.614389] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.842 [2024-11-30 15:44:59.614406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.842 #15 NEW cov: 12372 ft: 13983 corp: 4/59b lim: 35 exec/s: 0 rss: 72Mb L: 18/32 MS: 1 EraseBytes- 00:07:51.842 [2024-11-30 15:44:59.673991] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.842 [2024-11-30 15:44:59.674021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.842 #20 NEW cov: 12457 ft: 14293 corp: 5/67b lim: 35 exec/s: 0 rss: 72Mb L: 8/32 MS: 5 ChangeBit-ChangeBit-ChangeBinInt-ShuffleBytes-InsertRepeatedBytes- 00:07:51.842 [2024-11-30 15:44:59.714042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.842 [2024-11-30 15:44:59.714071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.842 #21 NEW cov: 12457 ft: 14428 corp: 6/75b lim: 35 exec/s: 0 rss: 72Mb L: 8/32 MS: 1 ChangeByte- 00:07:51.842 [2024-11-30 15:44:59.774328] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.842 [2024-11-30 15:44:59.774356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.842 [2024-11-30 15:44:59.774492] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.842 [2024-11-30 15:44:59.774509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.103 #22 NEW cov: 12457 ft: 14548 corp: 7/93b lim: 35 exec/s: 0 rss: 72Mb L: 18/32 MS: 1 InsertRepeatedBytes- 00:07:52.103 [2024-11-30 15:44:59.834540] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.103 [2024-11-30 15:44:59.834568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.103 [2024-11-30 15:44:59.834697] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.103 [2024-11-30 15:44:59.834717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.103 [2024-11-30 15:44:59.834843] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.103 [2024-11-30 15:44:59.834869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.103 #23 NEW cov: 12457 ft: 14750 corp: 8/114b lim: 35 exec/s: 0 rss: 72Mb L: 21/32 MS: 1 InsertRepeatedBytes- 00:07:52.103 [2024-11-30 15:44:59.894359] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.103 [2024-11-30 15:44:59.894386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.103 [2024-11-30 15:44:59.894515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.103 [2024-11-30 15:44:59.894531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.103 #24 NEW cov: 12457 ft: 14774 corp: 9/133b lim: 35 exec/s: 0 rss: 72Mb L: 19/32 MS: 1 InsertByte- 00:07:52.103 [2024-11-30 15:44:59.954383] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.103 [2024-11-30 15:44:59.954408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.103 [2024-11-30 15:44:59.954536] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.103 [2024-11-30 15:44:59.954557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.103 #30 NEW cov: 12457 ft: 14866 corp: 10/152b lim: 35 exec/s: 0 rss: 73Mb L: 19/32 MS: 1 ChangeByte- 00:07:52.103 [2024-11-30 15:45:00.024858] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007fa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.103 [2024-11-30 15:45:00.024886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.103 [2024-11-30 15:45:00.025027] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007fa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.103 [2024-11-30 15:45:00.025046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.103 NEW_FUNC[1/1]: 0x494398 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:52.103 #32 NEW cov: 12471 ft: 14908 corp: 11/174b lim: 35 exec/s: 0 rss: 73Mb L: 22/32 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:52.363 [2024-11-30 15:45:00.074660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007fa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.363 [2024-11-30 15:45:00.074689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.363 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:52.363 #33 NEW cov: 12494 ft: 15001 corp: 12/191b lim: 35 exec/s: 0 rss: 73Mb L: 17/32 MS: 1 CrossOver- 00:07:52.363 [2024-11-30 15:45:00.145029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.363 [2024-11-30 
15:45:00.145059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.363 [2024-11-30 15:45:00.145189] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.363 [2024-11-30 15:45:00.145207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.363 [2024-11-30 15:45:00.145340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.363 [2024-11-30 15:45:00.145357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.363 [2024-11-30 15:45:00.145489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.363 [2024-11-30 15:45:00.145506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.363 #34 NEW cov: 12494 ft: 15065 corp: 13/220b lim: 35 exec/s: 0 rss: 73Mb L: 29/32 MS: 1 CopyPart- 00:07:52.363 [2024-11-30 15:45:00.214637] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.363 [2024-11-30 15:45:00.214666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.363 [2024-11-30 15:45:00.214796] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.363 [2024-11-30 15:45:00.214814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.363 #35 NEW cov: 12494 ft: 15110 corp: 14/238b lim: 35 exec/s: 35 rss: 73Mb L: 18/32 MS: 1 ChangeByte- 00:07:52.364 [2024-11-30 15:45:00.264604] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.364 [2024-11-30 15:45:00.264634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.364 [2024-11-30 15:45:00.264768] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.364 [2024-11-30 15:45:00.264786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.364 #36 NEW cov: 12494 ft: 15163 corp: 15/256b lim: 35 exec/s: 36 rss: 73Mb L: 18/32 MS: 1 ShuffleBytes- 00:07:52.623 [2024-11-30 15:45:00.334943] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.623 [2024-11-30 15:45:00.334973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.623 [2024-11-30 15:45:00.335110] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.623 [2024-11-30 15:45:00.335128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.623 [2024-11-30 15:45:00.335259] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.623 [2024-11-30 15:45:00.335276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.623 #37 NEW cov: 12494 ft: 15191 corp: 16/277b lim: 35 exec/s: 37 rss: 73Mb L: 21/32 MS: 1 ChangeBinInt- 00:07:52.623 [2024-11-30 15:45:00.385147] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.623 [2024-11-30 15:45:00.385177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.623 [2024-11-30 15:45:00.385309] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.623 [2024-11-30 15:45:00.385328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.623 [2024-11-30 15:45:00.385468] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.623 [2024-11-30 15:45:00.385484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.623 [2024-11-30 15:45:00.385604] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.623 [2024-11-30 15:45:00.385621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.623 #38 NEW cov: 12494 ft: 15209 corp: 17/308b lim: 35 exec/s: 38 rss: 73Mb L: 31/32 MS: 1 CopyPart- 00:07:52.623 [2024-11-30 15:45:00.434683] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.624 [2024-11-30 15:45:00.434711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.624 [2024-11-30 15:45:00.434845] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.624 [2024-11-30 15:45:00.434861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.624 #39 NEW cov: 12494 ft: 15228 corp: 18/326b lim: 35 exec/s: 39 rss: 73Mb L: 18/32 MS: 1 ChangeBit- 00:07:52.624 [2024-11-30 15:45:00.484759] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.624 [2024-11-30 15:45:00.484788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.624 [2024-11-30 15:45:00.484919] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.624 [2024-11-30 15:45:00.484939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.624 #40 NEW cov: 12494 ft: 15298 corp: 
19/344b lim: 35 exec/s: 40 rss: 73Mb L: 18/32 MS: 1 ChangeByte- 00:07:52.624 [2024-11-30 15:45:00.535014] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.624 [2024-11-30 15:45:00.535044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.624 [2024-11-30 15:45:00.535173] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.624 [2024-11-30 15:45:00.535193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.624 [2024-11-30 15:45:00.535322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.624 [2024-11-30 15:45:00.535352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.624 #41 NEW cov: 12494 ft: 15331 corp: 20/367b lim: 35 exec/s: 41 rss: 73Mb L: 23/32 MS: 1 CMP- DE: "\000\000"- 00:07:52.624 [2024-11-30 15:45:00.584604] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.624 [2024-11-30 15:45:00.584634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.884 #42 NEW cov: 12494 ft: 15348 corp: 21/375b lim: 35 exec/s: 42 rss: 73Mb L: 8/32 MS: 1 ChangeBinInt- 00:07:52.884 [2024-11-30 15:45:00.635177] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.635208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.884 [2024-11-30 15:45:00.635334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.635354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.884 [2024-11-30 15:45:00.635483] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.635500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.884 #43 NEW cov: 12494 ft: 15390 corp: 22/396b lim: 35 exec/s: 43 rss: 73Mb L: 21/32 MS: 1 ChangeBit- 00:07:52.884 [2024-11-30 15:45:00.695175] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.695204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.884 [2024-11-30 15:45:00.695342] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007df SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.695360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.884 
[2024-11-30 15:45:00.695489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.695506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.884 #44 NEW cov: 12494 ft: 15395 corp: 23/419b lim: 35 exec/s: 44 rss: 73Mb L: 23/32 MS: 1 ChangeBit- 00:07:52.884 [2024-11-30 15:45:00.765233] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.765266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.884 [2024-11-30 15:45:00.765401] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.765418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.884 [2024-11-30 15:45:00.765545] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.765560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.884 #45 NEW cov: 12494 ft: 15422 corp: 24/441b lim: 35 exec/s: 45 rss: 73Mb L: 22/32 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:52.884 [2024-11-30 15:45:00.815142] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.815172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.884 [2024-11-30 15:45:00.815302] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.815318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.884 [2024-11-30 15:45:00.815446] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.884 [2024-11-30 15:45:00.815463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.144 #46 NEW cov: 12494 ft: 15431 corp: 25/464b lim: 35 exec/s: 46 rss: 73Mb L: 23/32 MS: 1 InsertByte- 00:07:53.144 [2024-11-30 15:45:00.885302] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:00.885331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.144 [2024-11-30 15:45:00.885476] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:00.885494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.144 [2024-11-30 15:45:00.885625] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:00.885644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.144 #47 NEW cov: 12494 ft: 15432 corp: 26/487b lim: 35 exec/s: 47 rss: 74Mb L: 23/32 MS: 1 ChangeBinInt- 00:07:53.144 [2024-11-30 15:45:00.955239] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:00.955269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.144 [2024-11-30 15:45:00.955405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:00.955422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.144 [2024-11-30 15:45:00.955557] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007bf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:00.955574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.144 #48 NEW cov: 12494 ft: 15466 corp: 27/510b lim: 35 exec/s: 48 rss: 74Mb L: 23/32 MS: 1 ChangeBit- 00:07:53.144 [2024-11-30 15:45:01.025528] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:01.025557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.144 [2024-11-30 15:45:01.025701] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:01.025717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.144 [2024-11-30 15:45:01.025849] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:01.025866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.144 [2024-11-30 15:45:01.025995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:01.026012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.144 #49 NEW cov: 12494 ft: 15475 corp: 28/543b lim: 35 exec/s: 49 rss: 74Mb L: 33/33 MS: 1 CrossOver- 00:07:53.144 [2024-11-30 15:45:01.064819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:01.064846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.144 #50 NEW cov: 12494 ft: 15496 corp: 29/551b lim: 35 exec/s: 50 rss: 74Mb L: 8/33 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:53.144 
[2024-11-30 15:45:01.105353] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:01.105381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.144 [2024-11-30 15:45:01.105511] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:01.105531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.144 [2024-11-30 15:45:01.105666] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.144 [2024-11-30 15:45:01.105683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.404 #51 NEW cov: 12494 ft: 15503 corp: 30/575b lim: 35 exec/s: 51 rss: 74Mb L: 24/33 MS: 1 EraseBytes- 00:07:53.404 [2024-11-30 15:45:01.175286] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007fa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.404 [2024-11-30 15:45:01.175315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.404 #52 NEW cov: 12494 ft: 15521 corp: 31/592b lim: 35 exec/s: 26 rss: 74Mb L: 17/33 MS: 1 CMP- DE: "\306\003\000\000\000\000\000\000"- 00:07:53.404 #52 DONE cov: 12494 ft: 15521 corp: 31/592b lim: 35 exec/s: 26 rss: 74Mb 00:07:53.404 ###### Recommended dictionary. ###### 00:07:53.404 "\000\000\000\000" # Uses: 2 00:07:53.404 "\000\000" # Uses: 0 00:07:53.404 "\306\003\000\000\000\000\000\000" # Uses: 0 00:07:53.404 ###### End of recommended dictionary. 
###### 00:07:53.404 Done 52 runs in 2 second(s) 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:53.404 15:45:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:07:53.404 [2024-11-30 15:45:01.358750] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:53.404 [2024-11-30 15:45:01.358818] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1717913 ] 00:07:53.663 [2024-11-30 15:45:01.600221] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
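Each run closes with a "Recommended dictionary" block: byte strings libFuzzer found productive as mutation tokens, printed with C-style octal escapes plus a use count. Those entries can be carried forward by saving them to a dictionary file and passing libFuzzer's standard -dict=FILE option on a later run (whether this SPDK wrapper forwards that flag is not shown in this log). Note that libFuzzer dictionary files accept \" \\ and \xNN hex escapes, not octal, so the logged entries need converting. A hypothetical dictionary file built from run 15's output above:

    # nvmf_15.dict (hypothetical file name), libFuzzer/AFL dictionary syntax
    "\x00\x00\x00\x00"                    # logged as "\000\000\000\000", used twice
    "\x00\x00"                            # logged as "\000\000"
    "\xc6\x03\x00\x00\x00\x00\x00\x00"    # logged as "\306\003\000\000\000\000\000\000"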
00:07:53.921 [2024-11-30 15:45:01.647723] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.921 [2024-11-30 15:45:01.661893] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.921 [2024-11-30 15:45:01.714428] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:53.921 [2024-11-30 15:45:01.730774] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:07:53.921 INFO: Running with entropic power schedule (0xFF, 100). 00:07:53.921 INFO: Seed: 1294277221 00:07:53.921 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:53.921 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:53.922 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:53.922 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.922 #2 INITED exec/s: 0 rss: 64Mb 00:07:53.922 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:53.922 This may also happen if the target rejected all inputs we tried so far 00:07:53.922 [2024-11-30 15:45:01.799681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.922 [2024-11-30 15:45:01.799720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.922 [2024-11-30 15:45:01.799839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.922 [2024-11-30 15:45:01.799870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.180 NEW_FUNC[1/717]: 0x475848 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:07:54.180 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:54.180 #6 NEW cov: 12358 ft: 12359 corp: 2/47b lim: 105 exec/s: 0 rss: 72Mb L: 46/46 MS: 4 ShuffleBytes-ChangeByte-CrossOver-InsertRepeatedBytes- 00:07:54.439 [2024-11-30 15:45:02.149995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.439 [2024-11-30 15:45:02.150066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.439 #12 NEW cov: 12471 ft: 13317 corp: 3/87b lim: 105 exec/s: 0 rss: 72Mb L: 40/46 MS: 1 EraseBytes- 00:07:54.439 [2024-11-30 15:45:02.230058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.439 [2024-11-30 15:45:02.230098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.439 [2024-11-30 15:45:02.230225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.439 [2024-11-30 15:45:02.230252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.439 [2024-11-30 15:45:02.230376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.439 [2024-11-30 15:45:02.230402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.439 #18 NEW cov: 12477 ft: 13924 corp: 4/169b lim: 105 exec/s: 0 rss: 72Mb L: 82/82 MS: 1 CrossOver- 00:07:54.439 [2024-11-30 15:45:02.280303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:10055284024492657547 len:35724 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.439 [2024-11-30 15:45:02.280342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.439 [2024-11-30 15:45:02.280416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:10055284024492657547 len:35724 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.439 [2024-11-30 15:45:02.280441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.439 [2024-11-30 15:45:02.280563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:10055284024492657547 len:35724 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.439 [2024-11-30 15:45:02.280587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.439 [2024-11-30 15:45:02.280711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:10055284024492657547 len:35724 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.439 [2024-11-30 15:45:02.280734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.439 #29 NEW cov: 12562 ft: 14699 corp: 5/264b lim: 105 exec/s: 0 rss: 72Mb L: 95/95 MS: 1 InsertRepeatedBytes- 00:07:54.439 [2024-11-30 15:45:02.339784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.439 [2024-11-30 15:45:02.339817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.439 #30 NEW cov: 12562 ft: 14779 corp: 6/285b lim: 105 exec/s: 0 rss: 72Mb L: 21/95 MS: 1 EraseBytes- 00:07:54.699 [2024-11-30 15:45:02.410043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.699 [2024-11-30 15:45:02.410070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.699 [2024-11-30 15:45:02.410208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.699 [2024-11-30 15:45:02.410233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.699 #31 NEW cov: 12562 ft: 14884 corp: 7/331b lim: 105 exec/s: 0 rss: 72Mb L: 46/95 MS: 1 ChangeBinInt- 00:07:54.699 [2024-11-30 
15:45:02.460082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.699 [2024-11-30 15:45:02.460116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.699 [2024-11-30 15:45:02.460242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.699 [2024-11-30 15:45:02.460262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.699 #32 NEW cov: 12562 ft: 14963 corp: 8/377b lim: 105 exec/s: 0 rss: 72Mb L: 46/95 MS: 1 ShuffleBytes- 00:07:54.699 [2024-11-30 15:45:02.510432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.699 [2024-11-30 15:45:02.510468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.700 [2024-11-30 15:45:02.510559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.700 [2024-11-30 15:45:02.510582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.700 [2024-11-30 15:45:02.510718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.700 [2024-11-30 15:45:02.510740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.700 [2024-11-30 15:45:02.510870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:3386706919782612991 len:65291 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.700 [2024-11-30 15:45:02.510894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.700 #33 NEW cov: 12562 ft: 14996 corp: 9/463b lim: 105 exec/s: 0 rss: 72Mb L: 86/95 MS: 1 CrossOver- 00:07:54.700 [2024-11-30 15:45:02.559859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.700 [2024-11-30 15:45:02.559885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.700 #34 NEW cov: 12562 ft: 15062 corp: 10/485b lim: 105 exec/s: 0 rss: 72Mb L: 22/95 MS: 1 CrossOver- 00:07:54.700 [2024-11-30 15:45:02.630358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.700 [2024-11-30 15:45:02.630393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.700 [2024-11-30 15:45:02.630505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.700 [2024-11-30 15:45:02.630528] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.700 [2024-11-30 15:45:02.630652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.700 [2024-11-30 15:45:02.630675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.959 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:54.959 #35 NEW cov: 12585 ft: 15107 corp: 11/567b lim: 105 exec/s: 0 rss: 72Mb L: 82/95 MS: 1 ChangeByte- 00:07:54.959 [2024-11-30 15:45:02.700250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18374962457090195455 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.959 [2024-11-30 15:45:02.700276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.959 [2024-11-30 15:45:02.700411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.959 [2024-11-30 15:45:02.700433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.959 #41 NEW cov: 12585 ft: 15187 corp: 12/613b lim: 105 exec/s: 0 rss: 72Mb L: 46/95 MS: 1 ChangeBinInt- 00:07:54.959 [2024-11-30 15:45:02.770238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.959 [2024-11-30 15:45:02.770271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.959 [2024-11-30 15:45:02.770414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.959 [2024-11-30 15:45:02.770436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.959 #42 NEW cov: 12585 ft: 15217 corp: 13/672b lim: 105 exec/s: 42 rss: 73Mb L: 59/95 MS: 1 EraseBytes- 00:07:54.959 [2024-11-30 15:45:02.840462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.959 [2024-11-30 15:45:02.840499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.959 [2024-11-30 15:45:02.840595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.959 [2024-11-30 15:45:02.840625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.959 [2024-11-30 15:45:02.840747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.959 [2024-11-30 15:45:02.840767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.959 #48 NEW cov: 12585 ft: 15251 corp: 14/738b lim: 105 exec/s: 48 rss: 73Mb L: 66/95 MS: 1 CrossOver- 00:07:54.959 [2024-11-30 15:45:02.890085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.959 [2024-11-30 15:45:02.890112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.959 #49 NEW cov: 12585 ft: 15296 corp: 15/759b lim: 105 exec/s: 49 rss: 73Mb L: 21/95 MS: 1 ShuffleBytes- 00:07:55.219 [2024-11-30 15:45:02.941024] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18374962457090195455 len:65442 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.219 [2024-11-30 15:45:02.941060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.219 [2024-11-30 15:45:02.941152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11646767826930344353 len:41378 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.219 [2024-11-30 15:45:02.941174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.219 [2024-11-30 15:45:02.941298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11646767826930344353 len:41378 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.219 [2024-11-30 15:45:02.941321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.219 [2024-11-30 15:45:02.941438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:11646768230657270177 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.219 [2024-11-30 15:45:02.941462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.219 [2024-11-30 15:45:02.941585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.219 [2024-11-30 15:45:02.941603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:55.219 #50 NEW cov: 12585 ft: 15368 corp: 16/864b lim: 105 exec/s: 50 rss: 73Mb L: 105/105 MS: 1 InsertRepeatedBytes- 00:07:55.219 [2024-11-30 15:45:03.010304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.219 [2024-11-30 15:45:03.010331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.219 #51 NEW cov: 12585 ft: 15387 corp: 17/904b lim: 105 exec/s: 51 rss: 73Mb L: 40/105 MS: 1 ChangeBit- 00:07:55.219 [2024-11-30 15:45:03.061091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18374962457090195455 len:65442 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.219 [2024-11-30 15:45:03.061124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.219 [2024-11-30 
15:45:03.061219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11646767826930344353 len:41386 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.219 [2024-11-30 15:45:03.061243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.219 [2024-11-30 15:45:03.061354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11646767826930344353 len:41378 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-30 15:45:03.061376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.220 [2024-11-30 15:45:03.061495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:11646768230657270177 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-30 15:45:03.061519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.220 [2024-11-30 15:45:03.061644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-30 15:45:03.061665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:55.220 #52 NEW cov: 12585 ft: 15415 corp: 18/1009b lim: 105 exec/s: 52 rss: 73Mb L: 105/105 MS: 1 ChangeBit- 00:07:55.220 [2024-11-30 15:45:03.130536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17870283321406128127 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-30 15:45:03.130566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.220 [2024-11-30 15:45:03.130703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-30 15:45:03.130724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.220 #53 NEW cov: 12585 ft: 15445 corp: 19/1055b lim: 105 exec/s: 53 rss: 73Mb L: 46/105 MS: 1 ChangeBit- 00:07:55.220 [2024-11-30 15:45:03.180350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-30 15:45:03.180385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.479 #54 NEW cov: 12585 ft: 15462 corp: 20/1095b lim: 105 exec/s: 54 rss: 73Mb L: 40/105 MS: 1 ChangeBit- 00:07:55.479 [2024-11-30 15:45:03.250474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-30 15:45:03.250501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.479 #55 NEW cov: 12585 ft: 15481 corp: 21/1116b lim: 105 exec/s: 55 rss: 73Mb L: 21/105 MS: 1 ShuffleBytes- 00:07:55.479 [2024-11-30 15:45:03.320655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 
nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-30 15:45:03.320688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.479 [2024-11-30 15:45:03.320806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446464797756096511 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-30 15:45:03.320839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.479 #56 NEW cov: 12585 ft: 15504 corp: 22/1162b lim: 105 exec/s: 56 rss: 73Mb L: 46/105 MS: 1 ChangeBinInt- 00:07:55.479 [2024-11-30 15:45:03.370521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-30 15:45:03.370551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.479 #57 NEW cov: 12585 ft: 15574 corp: 23/1202b lim: 105 exec/s: 57 rss: 73Mb L: 40/105 MS: 1 ShuffleBytes- 00:07:55.479 [2024-11-30 15:45:03.420635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:63488 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-30 15:45:03.420672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.738 #58 NEW cov: 12585 ft: 15583 corp: 24/1242b lim: 105 exec/s: 58 rss: 73Mb L: 40/105 MS: 1 ChangeBit- 00:07:55.739 [2024-11-30 15:45:03.491349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.739 [2024-11-30 15:45:03.491387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.739 [2024-11-30 15:45:03.491473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65448 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.739 [2024-11-30 15:45:03.491496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.739 [2024-11-30 15:45:03.491625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12080808863958804391 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.739 [2024-11-30 15:45:03.491655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.739 [2024-11-30 15:45:03.491787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:12080808863958804391 len:42920 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.739 [2024-11-30 15:45:03.491813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.739 #59 NEW cov: 12585 ft: 15590 corp: 25/1339b lim: 105 exec/s: 59 rss: 73Mb L: 97/105 MS: 1 InsertRepeatedBytes- 00:07:55.739 [2024-11-30 15:45:03.540676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:55.739 [2024-11-30 15:45:03.540713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.739 #60 NEW cov: 12585 ft: 15670 corp: 26/1379b lim: 105 exec/s: 60 rss: 73Mb L: 40/105 MS: 1 ChangeBinInt- 00:07:55.739 [2024-11-30 15:45:03.590820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.739 [2024-11-30 15:45:03.590851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.739 #61 NEW cov: 12585 ft: 15699 corp: 27/1419b lim: 105 exec/s: 61 rss: 73Mb L: 40/105 MS: 1 ChangeByte- 00:07:55.739 [2024-11-30 15:45:03.640807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:63488 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.739 [2024-11-30 15:45:03.640840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.739 #62 NEW cov: 12585 ft: 15737 corp: 28/1459b lim: 105 exec/s: 62 rss: 73Mb L: 40/105 MS: 1 ChangeBinInt- 00:07:55.999 [2024-11-30 15:45:03.711187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-11-30 15:45:03.711225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.999 [2024-11-30 15:45:03.711319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-11-30 15:45:03.711343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.999 [2024-11-30 15:45:03.711463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-11-30 15:45:03.711487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:55.999 #63 NEW cov: 12585 ft: 15759 corp: 29/1541b lim: 105 exec/s: 63 rss: 73Mb L: 82/105 MS: 1 ChangeByte- 00:07:55.999 [2024-11-30 15:45:03.760829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.999 [2024-11-30 15:45:03.760858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.999 #64 pulse cov: 12585 ft: 15773 corp: 29/1541b lim: 105 exec/s: 32 rss: 73Mb 00:07:55.999 #64 NEW cov: 12585 ft: 15773 corp: 30/1581b lim: 105 exec/s: 32 rss: 73Mb L: 40/105 MS: 1 CopyPart- 00:07:55.999 #64 DONE cov: 12585 ft: 15773 corp: 30/1581b lim: 105 exec/s: 32 rss: 73Mb 00:07:55.999 Done 64 runs in 2 second(s) 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.999 15:45:03 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:55.999 15:45:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:07:55.999 [2024-11-30 15:45:03.941688] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:55.999 [2024-11-30 15:45:03.941773] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1718583 ] 00:07:56.259 [2024-11-30 15:45:04.186043] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:56.519 [2024-11-30 15:45:04.233375] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.519 [2024-11-30 15:45:04.247397] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.519 [2024-11-30 15:45:04.300073] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:56.519 [2024-11-30 15:45:04.316377] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:07:56.519 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:56.519 INFO: Seed: 3880278746 00:07:56.519 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:56.519 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:56.519 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:56.519 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.520 #2 INITED exec/s: 0 rss: 65Mb 00:07:56.520 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:56.520 This may also happen if the target rejected all inputs we tried so far 00:07:56.520 [2024-11-30 15:45:04.376576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.520 [2024-11-30 15:45:04.376622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.520 [2024-11-30 15:45:04.376765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.520 [2024-11-30 15:45:04.376790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.520 [2024-11-30 15:45:04.376930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.520 [2024-11-30 15:45:04.376954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.779 NEW_FUNC[1/717]: 0x478bc8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:07:56.779 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:56.779 #9 NEW cov: 12377 ft: 12380 corp: 2/76b lim: 120 exec/s: 0 rss: 72Mb L: 75/75 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:56.779 [2024-11-30 15:45:04.726821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.779 [2024-11-30 15:45:04.726868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.779 [2024-11-30 15:45:04.726996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.779 [2024-11-30 15:45:04.727019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.779 [2024-11-30 15:45:04.727149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.779 [2024-11-30 15:45:04.727169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.039 NEW_FUNC[1/1]: 0x1fbdd28 in spdk_thread_get_from_ctx /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:820 00:07:57.039 #10 NEW cov: 12492 ft: 13019 corp: 3/151b lim: 120 exec/s: 0 rss: 72Mb 
L: 75/75 MS: 1 ShuffleBytes- 00:07:57.039 [2024-11-30 15:45:04.806728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.039 [2024-11-30 15:45:04.806769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.039 [2024-11-30 15:45:04.806905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.039 [2024-11-30 15:45:04.806927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.039 [2024-11-30 15:45:04.807056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.039 [2024-11-30 15:45:04.807077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.039 #11 NEW cov: 12498 ft: 13294 corp: 4/226b lim: 120 exec/s: 0 rss: 72Mb L: 75/75 MS: 1 CrossOver- 00:07:57.039 [2024-11-30 15:45:04.876837] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.039 [2024-11-30 15:45:04.876874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.039 [2024-11-30 15:45:04.877004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.039 [2024-11-30 15:45:04.877027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.039 [2024-11-30 15:45:04.877156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.039 [2024-11-30 15:45:04.877180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.039 #17 NEW cov: 12583 ft: 13560 corp: 5/301b lim: 120 exec/s: 0 rss: 73Mb L: 75/75 MS: 1 ChangeBit- 00:07:57.039 [2024-11-30 15:45:04.946576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.039 [2024-11-30 15:45:04.946614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.039 [2024-11-30 15:45:04.946747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.039 [2024-11-30 15:45:04.946769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.039 #18 NEW cov: 12583 ft: 14056 corp: 6/358b lim: 120 exec/s: 0 rss: 73Mb L: 57/75 MS: 1 EraseBytes- 00:07:57.039 [2024-11-30 15:45:04.996898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:57.039 [2024-11-30 15:45:04.996933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.039 [2024-11-30 15:45:04.997043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.039 [2024-11-30 15:45:04.997062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.039 [2024-11-30 15:45:04.997193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.039 [2024-11-30 15:45:04.997231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.299 #19 NEW cov: 12583 ft: 14191 corp: 7/435b lim: 120 exec/s: 0 rss: 73Mb L: 77/77 MS: 1 CrossOver- 00:07:57.299 [2024-11-30 15:45:05.066919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.299 [2024-11-30 15:45:05.066953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.299 [2024-11-30 15:45:05.067056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.299 [2024-11-30 15:45:05.067080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.299 [2024-11-30 15:45:05.067208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.299 [2024-11-30 15:45:05.067233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.299 #25 NEW cov: 12583 ft: 14322 corp: 8/510b lim: 120 exec/s: 0 rss: 73Mb L: 75/77 MS: 1 CMP- DE: "\000\000\000\000\000\000\000H"- 00:07:57.299 [2024-11-30 15:45:05.116431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51211 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.299 [2024-11-30 15:45:05.116457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.299 #26 NEW cov: 12583 ft: 15132 corp: 9/556b lim: 120 exec/s: 0 rss: 73Mb L: 46/77 MS: 1 EraseBytes- 00:07:57.299 [2024-11-30 15:45:05.166976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.299 [2024-11-30 15:45:05.167012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.299 [2024-11-30 15:45:05.167138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.299 [2024-11-30 15:45:05.167163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.299 
[2024-11-30 15:45:05.167283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.299 [2024-11-30 15:45:05.167307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.299 #27 NEW cov: 12583 ft: 15166 corp: 10/631b lim: 120 exec/s: 0 rss: 73Mb L: 75/77 MS: 1 ChangeByte- 00:07:57.299 [2024-11-30 15:45:05.217124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.299 [2024-11-30 15:45:05.217158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.299 [2024-11-30 15:45:05.217291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.299 [2024-11-30 15:45:05.217313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.299 [2024-11-30 15:45:05.217438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.299 [2024-11-30 15:45:05.217463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.299 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:57.299 #28 NEW cov: 12606 ft: 15258 corp: 11/706b lim: 120 exec/s: 0 rss: 73Mb L: 75/77 MS: 1 ShuffleBytes- 00:07:57.558 [2024-11-30 15:45:05.287074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.558 [2024-11-30 15:45:05.287112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.558 [2024-11-30 15:45:05.287231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.558 [2024-11-30 15:45:05.287254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.558 [2024-11-30 15:45:05.287391] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.558 [2024-11-30 15:45:05.287416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.558 #29 NEW cov: 12606 ft: 15295 corp: 12/781b lim: 120 exec/s: 0 rss: 73Mb L: 75/77 MS: 1 ShuffleBytes- 00:07:57.558 [2024-11-30 15:45:05.337163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.558 [2024-11-30 15:45:05.337203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.558 [2024-11-30 15:45:05.337326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.558 [2024-11-30 15:45:05.337355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.558 [2024-11-30 15:45:05.337492] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.558 [2024-11-30 15:45:05.337516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.559 #30 NEW cov: 12606 ft: 15300 corp: 13/856b lim: 120 exec/s: 30 rss: 73Mb L: 75/77 MS: 1 ShuffleBytes- 00:07:57.559 [2024-11-30 15:45:05.387132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034146708539592 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.559 [2024-11-30 15:45:05.387169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.559 [2024-11-30 15:45:05.387281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.559 [2024-11-30 15:45:05.387306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.559 [2024-11-30 15:45:05.387447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.559 [2024-11-30 15:45:05.387473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.559 #31 NEW cov: 12606 ft: 15333 corp: 14/931b lim: 120 exec/s: 31 rss: 73Mb L: 75/77 MS: 1 ChangeByte- 00:07:57.559 [2024-11-30 15:45:05.457194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51456 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.559 [2024-11-30 15:45:05.457227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.559 [2024-11-30 15:45:05.457340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034568541685960 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.559 [2024-11-30 15:45:05.457365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.559 [2024-11-30 15:45:05.457497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468033751571548360 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.559 [2024-11-30 15:45:05.457522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.559 #32 NEW cov: 12606 ft: 15348 corp: 15/1021b lim: 120 exec/s: 32 rss: 73Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:07:57.818 [2024-11-30 15:45:05.527288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.527325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.818 [2024-11-30 15:45:05.527437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034584795203784 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.527461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.818 [2024-11-30 15:45:05.527585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14467825660406057160 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.527618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.818 #33 NEW cov: 12606 ft: 15382 corp: 16/1097b lim: 120 exec/s: 33 rss: 73Mb L: 76/90 MS: 1 CrossOver- 00:07:57.818 [2024-11-30 15:45:05.577230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034593385138376 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.577263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.818 [2024-11-30 15:45:05.577387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.577410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.818 [2024-11-30 15:45:05.577545] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.577567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.818 #34 NEW cov: 12606 ft: 15390 corp: 17/1172b lim: 120 exec/s: 34 rss: 73Mb L: 75/90 MS: 1 ChangeBinInt- 00:07:57.818 [2024-11-30 15:45:05.627029] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.627066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.818 [2024-11-30 15:45:05.627202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334464 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.627224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.818 #35 NEW cov: 12606 ft: 15425 corp: 18/1230b lim: 120 exec/s: 35 rss: 73Mb L: 58/90 MS: 1 InsertByte- 00:07:57.818 [2024-11-30 15:45:05.677118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.677152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.818 [2024-11-30 15:45:05.677279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.677301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.818 #36 NEW cov: 12606 ft: 15484 corp: 19/1285b lim: 120 exec/s: 36 rss: 73Mb L: 55/90 MS: 1 EraseBytes- 00:07:57.818 [2024-11-30 15:45:05.727765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.727799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.818 [2024-11-30 15:45:05.727902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034584795203784 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.727925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.818 [2024-11-30 15:45:05.728051] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.728069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.818 [2024-11-30 15:45:05.728201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.818 [2024-11-30 15:45:05.728226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.818 #37 NEW cov: 12606 ft: 15836 corp: 20/1400b lim: 120 exec/s: 37 rss: 73Mb L: 115/115 MS: 1 CopyPart- 00:07:58.078 [2024-11-30 15:45:05.797693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51456 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.078 [2024-11-30 15:45:05.797729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.078 [2024-11-30 15:45:05.797818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034568541685960 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.078 [2024-11-30 15:45:05.797839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.078 [2024-11-30 15:45:05.797953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12442509728149187756 len:44233 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.078 [2024-11-30 15:45:05.797974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.078 [2024-11-30 15:45:05.798098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:14468034564427663560 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.078 [2024-11-30 15:45:05.798118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.078 #38 NEW cov: 12606 ft: 15861 corp: 21/1507b lim: 120 exec/s: 38 rss: 73Mb L: 107/115 MS: 1 
InsertRepeatedBytes- 00:07:58.078 [2024-11-30 15:45:05.866962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.078 [2024-11-30 15:45:05.866989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.078 #39 NEW cov: 12606 ft: 15883 corp: 22/1550b lim: 120 exec/s: 39 rss: 73Mb L: 43/115 MS: 1 EraseBytes- 00:07:58.078 [2024-11-30 15:45:05.917317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034593385138376 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.078 [2024-11-30 15:45:05.917351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.078 [2024-11-30 15:45:05.917470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.078 [2024-11-30 15:45:05.917493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.078 #40 NEW cov: 12606 ft: 15915 corp: 23/1608b lim: 120 exec/s: 40 rss: 73Mb L: 58/115 MS: 1 EraseBytes- 00:07:58.078 [2024-11-30 15:45:05.987002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51211 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.078 [2024-11-30 15:45:05.987029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.078 #41 NEW cov: 12606 ft: 15923 corp: 24/1649b lim: 120 exec/s: 41 rss: 74Mb L: 41/115 MS: 1 EraseBytes- 00:07:58.338 [2024-11-30 15:45:06.057590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.057627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.338 [2024-11-30 15:45:06.057714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.057738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.338 [2024-11-30 15:45:06.057864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.057886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.338 #42 NEW cov: 12606 ft: 15989 corp: 25/1724b lim: 120 exec/s: 42 rss: 74Mb L: 75/115 MS: 1 CMP- DE: "\001\000\377\377"- 00:07:58.338 [2024-11-30 15:45:06.106962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.106996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.338 #43 NEW cov: 12606 ft: 16028 corp: 26/1767b lim: 
120 exec/s: 43 rss: 74Mb L: 43/115 MS: 1 CMP- DE: "\001\000"- 00:07:58.338 [2024-11-30 15:45:06.177710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.177745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.338 [2024-11-30 15:45:06.177831] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.177852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.338 [2024-11-30 15:45:06.177985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.178008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.338 #44 NEW cov: 12606 ft: 16100 corp: 27/1842b lim: 120 exec/s: 44 rss: 74Mb L: 75/115 MS: 1 ShuffleBytes- 00:07:58.338 [2024-11-30 15:45:06.228016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.228047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.338 [2024-11-30 15:45:06.228135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.228155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.338 [2024-11-30 15:45:06.228283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:72057356888360961 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.228304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.338 [2024-11-30 15:45:06.228429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.228448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.338 #45 NEW cov: 12606 ft: 16130 corp: 28/1954b lim: 120 exec/s: 45 rss: 74Mb L: 112/115 MS: 1 CrossOver- 00:07:58.338 [2024-11-30 15:45:06.297176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14468033712916842696 len:65481 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.338 [2024-11-30 15:45:06.297202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.598 #46 NEW cov: 12606 ft: 16164 corp: 29/2001b lim: 120 exec/s: 46 rss: 74Mb L: 47/115 MS: 1 PersAutoDict- DE: "\001\000\377\377"- 00:07:58.598 [2024-11-30 15:45:06.367572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 
lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.598 [2024-11-30 15:45:06.367608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.598 [2024-11-30 15:45:06.367736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14468034567615334600 len:51401 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.598 [2024-11-30 15:45:06.367760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.598 #47 NEW cov: 12606 ft: 16185 corp: 30/2059b lim: 120 exec/s: 23 rss: 74Mb L: 58/115 MS: 1 CopyPart- 00:07:58.598 #47 DONE cov: 12606 ft: 16185 corp: 30/2059b lim: 120 exec/s: 23 rss: 74Mb 00:07:58.598 ###### Recommended dictionary. ###### 00:07:58.598 "\000\000\000\000\000\000\000H" # Uses: 0 00:07:58.598 "\001\000\377\377" # Uses: 1 00:07:58.598 "\001\000" # Uses: 0 00:07:58.598 ###### End of recommended dictionary. ###### 00:07:58.598 Done 47 runs in 2 second(s) 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:58.598 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:07:58.599 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.599 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:58.599 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:58.599 15:45:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c 
/tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:07:58.599 [2024-11-30 15:45:06.544679] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:58.599 [2024-11-30 15:45:06.544740] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1719267 ] 00:07:59.166 [2024-11-30 15:45:06.863806] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:59.166 [2024-11-30 15:45:06.911003] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.166 [2024-11-30 15:45:06.931742] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.166 [2024-11-30 15:45:06.984148] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:59.166 [2024-11-30 15:45:07.000454] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:07:59.166 INFO: Running with entropic power schedule (0xFF, 100). 00:07:59.166 INFO: Seed: 2267997757 00:07:59.166 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:59.166 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:59.166 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:59.166 INFO: A corpus is not provided, starting from an empty corpus 00:07:59.166 #2 INITED exec/s: 0 rss: 64Mb 00:07:59.166 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:59.166 This may also happen if the target rejected all inputs we tried so far 00:07:59.166 [2024-11-30 15:45:07.047242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.166 [2024-11-30 15:45:07.047271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.166 [2024-11-30 15:45:07.047326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.166 [2024-11-30 15:45:07.047340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.166 [2024-11-30 15:45:07.047396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.166 [2024-11-30 15:45:07.047412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.425 NEW_FUNC[1/715]: 0x47c4b8 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:07:59.425 NEW_FUNC[2/715]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:59.425 #22 NEW cov: 12315 ft: 12313 corp: 2/67b lim: 100 exec/s: 0 rss: 72Mb L: 66/66 MS: 5 CopyPart-CopyPart-InsertRepeatedBytes-InsertByte-InsertRepeatedBytes- 00:07:59.425 [2024-11-30 15:45:07.377155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.425 [2024-11-30 15:45:07.377187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.425 [2024-11-30 15:45:07.377219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.425 [2024-11-30 15:45:07.377234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.425 [2024-11-30 15:45:07.377285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.425 [2024-11-30 15:45:07.377298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.684 NEW_FUNC[1/1]: 0x1a68888 in nvme_tcp_ctrlr_connect_qpair_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_tcp.c:2299 00:07:59.684 #23 NEW cov: 12435 ft: 13017 corp: 3/133b lim: 100 exec/s: 0 rss: 72Mb L: 66/66 MS: 1 ShuffleBytes- 00:07:59.684 [2024-11-30 15:45:07.446902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.684 [2024-11-30 15:45:07.446930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.684 #29 NEW cov: 12441 ft: 13626 corp: 4/171b lim: 100 exec/s: 0 rss: 72Mb L: 38/66 MS: 1 InsertRepeatedBytes- 00:07:59.684 [2024-11-30 15:45:07.486999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.684 [2024-11-30 15:45:07.487025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.684 [2024-11-30 15:45:07.487074] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.684 [2024-11-30 15:45:07.487092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.684 #30 NEW cov: 12526 ft: 14148 corp: 5/220b lim: 100 exec/s: 0 rss: 73Mb L: 49/66 MS: 1 CopyPart- 00:07:59.684 [2024-11-30 15:45:07.547239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.684 [2024-11-30 15:45:07.547265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.684 [2024-11-30 15:45:07.547309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.684 [2024-11-30 15:45:07.547324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.684 [2024-11-30 15:45:07.547376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.684 [2024-11-30 15:45:07.547389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.684 [2024-11-30 15:45:07.547439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:59.684 [2024-11-30 15:45:07.547453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.684 #33 NEW cov: 12526 ft: 14499 corp: 6/314b lim: 100 exec/s: 0 rss: 73Mb L: 94/94 MS: 3 ChangeBinInt-ChangeBit-InsertRepeatedBytes- 00:07:59.684 [2024-11-30 15:45:07.587272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.684 [2024-11-30 15:45:07.587298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.684 [2024-11-30 15:45:07.587342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.684 [2024-11-30 15:45:07.587356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.684 [2024-11-30 15:45:07.587408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.684 [2024-11-30 15:45:07.587422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.684 [2024-11-30 15:45:07.587472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:59.684 [2024-11-30 15:45:07.587486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.684 #34 NEW cov: 12526 ft: 14622 corp: 7/408b lim: 100 exec/s: 0 rss: 73Mb L: 94/94 MS: 1 ShuffleBytes- 00:07:59.684 [2024-11-30 15:45:07.647183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.684 [2024-11-30 15:45:07.647209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.684 [2024-11-30 15:45:07.647243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE 
ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.684 [2024-11-30 15:45:07.647257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.684 [2024-11-30 15:45:07.647307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.684 [2024-11-30 15:45:07.647321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.943 #35 NEW cov: 12526 ft: 14714 corp: 8/474b lim: 100 exec/s: 0 rss: 73Mb L: 66/94 MS: 1 ChangeByte- 00:07:59.943 [2024-11-30 15:45:07.687280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.943 [2024-11-30 15:45:07.687306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.687350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.943 [2024-11-30 15:45:07.687366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.687418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.943 [2024-11-30 15:45:07.687432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.687482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:59.943 [2024-11-30 15:45:07.687495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.943 #36 NEW cov: 12526 ft: 14787 corp: 9/568b lim: 100 exec/s: 0 rss: 73Mb L: 94/94 MS: 1 CMP- DE: "\000\224\306\3114\312 \350"- 00:07:59.943 [2024-11-30 15:45:07.727316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.943 [2024-11-30 15:45:07.727341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.727394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.943 [2024-11-30 15:45:07.727407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.727456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.943 [2024-11-30 15:45:07.727469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.727519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:59.943 [2024-11-30 15:45:07.727533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.943 #37 NEW cov: 12526 ft: 14828 corp: 10/664b lim: 100 exec/s: 0 rss: 73Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:07:59.943 [2024-11-30 15:45:07.767369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.943 
[2024-11-30 15:45:07.767395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.767440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.943 [2024-11-30 15:45:07.767454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.767506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.943 [2024-11-30 15:45:07.767520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.767571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:59.943 [2024-11-30 15:45:07.767584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.943 #38 NEW cov: 12526 ft: 14890 corp: 11/745b lim: 100 exec/s: 0 rss: 73Mb L: 81/96 MS: 1 InsertRepeatedBytes- 00:07:59.943 [2024-11-30 15:45:07.807405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.943 [2024-11-30 15:45:07.807431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.807473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.943 [2024-11-30 15:45:07.807487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.807537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.943 [2024-11-30 15:45:07.807553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.807605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:59.943 [2024-11-30 15:45:07.807620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.943 #39 NEW cov: 12526 ft: 14906 corp: 12/839b lim: 100 exec/s: 0 rss: 73Mb L: 94/96 MS: 1 ChangeBinInt- 00:07:59.943 [2024-11-30 15:45:07.847407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.943 [2024-11-30 15:45:07.847432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.847477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.943 [2024-11-30 15:45:07.847490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.847541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.943 [2024-11-30 15:45:07.847555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.943 
[2024-11-30 15:45:07.847607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:59.943 [2024-11-30 15:45:07.847621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.943 #40 NEW cov: 12526 ft: 14936 corp: 13/936b lim: 100 exec/s: 0 rss: 73Mb L: 97/97 MS: 1 InsertByte- 00:07:59.943 [2024-11-30 15:45:07.907357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:59.943 [2024-11-30 15:45:07.907384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.907429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:59.943 [2024-11-30 15:45:07.907444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.943 [2024-11-30 15:45:07.907494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:59.943 [2024-11-30 15:45:07.907508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.202 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:00.202 #41 NEW cov: 12549 ft: 14998 corp: 14/1002b lim: 100 exec/s: 0 rss: 73Mb L: 66/97 MS: 1 ChangeBinInt- 00:08:00.202 [2024-11-30 15:45:07.967553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.202 [2024-11-30 15:45:07.967580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.202 [2024-11-30 15:45:07.967630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.202 [2024-11-30 15:45:07.967645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.202 [2024-11-30 15:45:07.967697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.202 [2024-11-30 15:45:07.967711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.202 [2024-11-30 15:45:07.967761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:00.202 [2024-11-30 15:45:07.967775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.202 #42 NEW cov: 12549 ft: 15015 corp: 15/1091b lim: 100 exec/s: 0 rss: 73Mb L: 89/97 MS: 1 InsertRepeatedBytes- 00:08:00.202 [2024-11-30 15:45:08.007138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.202 [2024-11-30 15:45:08.007165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.202 #43 NEW cov: 12549 ft: 15030 corp: 16/1129b lim: 100 exec/s: 0 rss: 73Mb L: 38/97 MS: 1 CopyPart- 00:08:00.202 [2024-11-30 15:45:08.047494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.202 
[2024-11-30 15:45:08.047519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.202 [2024-11-30 15:45:08.047578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.202 [2024-11-30 15:45:08.047593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.202 [2024-11-30 15:45:08.047649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.202 [2024-11-30 15:45:08.047663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.202 [2024-11-30 15:45:08.047713] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:00.202 [2024-11-30 15:45:08.047727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.202 #44 NEW cov: 12549 ft: 15045 corp: 17/1210b lim: 100 exec/s: 44 rss: 73Mb L: 81/97 MS: 1 ChangeBinInt- 00:08:00.202 [2024-11-30 15:45:08.107565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.202 [2024-11-30 15:45:08.107591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.202 [2024-11-30 15:45:08.107644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.202 [2024-11-30 15:45:08.107658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.202 [2024-11-30 15:45:08.107707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.202 [2024-11-30 15:45:08.107722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.202 [2024-11-30 15:45:08.107774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:00.202 [2024-11-30 15:45:08.107789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.202 #45 NEW cov: 12549 ft: 15099 corp: 18/1299b lim: 100 exec/s: 45 rss: 73Mb L: 89/97 MS: 1 CMP- DE: "\000\224\306\311s2\037\246"- 00:08:00.467 [2024-11-30 15:45:08.167354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.467 [2024-11-30 15:45:08.167381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.467 [2024-11-30 15:45:08.167426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.467 [2024-11-30 15:45:08.167440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.467 #46 NEW cov: 12549 ft: 15125 corp: 19/1356b lim: 100 exec/s: 46 rss: 73Mb L: 57/97 MS: 1 EraseBytes- 00:08:00.467 [2024-11-30 15:45:08.227580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.467 [2024-11-30 15:45:08.227611] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.467 [2024-11-30 15:45:08.227664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.467 [2024-11-30 15:45:08.227681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.467 [2024-11-30 15:45:08.227731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.467 [2024-11-30 15:45:08.227745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.467 [2024-11-30 15:45:08.227795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:00.467 [2024-11-30 15:45:08.227809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.467 #47 NEW cov: 12549 ft: 15144 corp: 20/1437b lim: 100 exec/s: 47 rss: 73Mb L: 81/97 MS: 1 ChangeByte- 00:08:00.467 [2024-11-30 15:45:08.267500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.467 [2024-11-30 15:45:08.267526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.467 [2024-11-30 15:45:08.267563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.467 [2024-11-30 15:45:08.267576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.467 [2024-11-30 15:45:08.267629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.467 [2024-11-30 15:45:08.267644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.467 #48 NEW cov: 12549 ft: 15211 corp: 21/1507b lim: 100 exec/s: 48 rss: 73Mb L: 70/97 MS: 1 EraseBytes- 00:08:00.467 [2024-11-30 15:45:08.327628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.467 [2024-11-30 15:45:08.327654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.467 [2024-11-30 15:45:08.327705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.467 [2024-11-30 15:45:08.327719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.467 [2024-11-30 15:45:08.327770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.467 [2024-11-30 15:45:08.327784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.467 [2024-11-30 15:45:08.327834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:00.467 [2024-11-30 15:45:08.327847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.467 #49 NEW cov: 12549 ft: 15218 corp: 22/1606b lim: 100 
exec/s: 49 rss: 73Mb L: 99/99 MS: 1 CrossOver- 00:08:00.467 [2024-11-30 15:45:08.387558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.467 [2024-11-30 15:45:08.387585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.467 [2024-11-30 15:45:08.387646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.467 [2024-11-30 15:45:08.387661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.467 [2024-11-30 15:45:08.387712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.467 [2024-11-30 15:45:08.387726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.467 #50 NEW cov: 12549 ft: 15236 corp: 23/1677b lim: 100 exec/s: 50 rss: 74Mb L: 71/99 MS: 1 InsertByte- 00:08:00.725 [2024-11-30 15:45:08.447692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.725 [2024-11-30 15:45:08.447719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.447764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.725 [2024-11-30 15:45:08.447778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.447828] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.725 [2024-11-30 15:45:08.447858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.447908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:00.725 [2024-11-30 15:45:08.447922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.725 #51 NEW cov: 12549 ft: 15244 corp: 24/1773b lim: 100 exec/s: 51 rss: 74Mb L: 96/99 MS: 1 CopyPart- 00:08:00.725 [2024-11-30 15:45:08.487723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.725 [2024-11-30 15:45:08.487749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.487797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.725 [2024-11-30 15:45:08.487811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.487860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.725 [2024-11-30 15:45:08.487873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.487923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 
00:08:00.725 [2024-11-30 15:45:08.487937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.725 #52 NEW cov: 12549 ft: 15252 corp: 25/1871b lim: 100 exec/s: 52 rss: 74Mb L: 98/99 MS: 1 InsertRepeatedBytes- 00:08:00.725 [2024-11-30 15:45:08.547771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.725 [2024-11-30 15:45:08.547797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.547861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.725 [2024-11-30 15:45:08.547875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.547928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.725 [2024-11-30 15:45:08.547942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.547992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:00.725 [2024-11-30 15:45:08.548006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.725 #53 NEW cov: 12549 ft: 15267 corp: 26/1967b lim: 100 exec/s: 53 rss: 74Mb L: 96/99 MS: 1 ChangeBit- 00:08:00.725 [2024-11-30 15:45:08.607446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.725 [2024-11-30 15:45:08.607472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.725 #54 NEW cov: 12549 ft: 15294 corp: 27/2006b lim: 100 exec/s: 54 rss: 74Mb L: 39/99 MS: 1 InsertByte- 00:08:00.725 [2024-11-30 15:45:08.647786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.725 [2024-11-30 15:45:08.647812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.647875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.725 [2024-11-30 15:45:08.647891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.647942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.725 [2024-11-30 15:45:08.647956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.725 [2024-11-30 15:45:08.648006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:00.725 [2024-11-30 15:45:08.648021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.725 #55 NEW cov: 12549 ft: 15311 corp: 28/2105b lim: 100 exec/s: 55 rss: 74Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:00.982 [2024-11-30 15:45:08.707816] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.982 [2024-11-30 15:45:08.707841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.707891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.982 [2024-11-30 15:45:08.707904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.707954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.982 [2024-11-30 15:45:08.707984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.708036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:00.982 [2024-11-30 15:45:08.708051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.982 #56 NEW cov: 12549 ft: 15330 corp: 29/2194b lim: 100 exec/s: 56 rss: 74Mb L: 89/99 MS: 1 ChangeBit- 00:08:00.982 [2024-11-30 15:45:08.767718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.982 [2024-11-30 15:45:08.767743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.767779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.982 [2024-11-30 15:45:08.767792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.767844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.982 [2024-11-30 15:45:08.767859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.982 #57 NEW cov: 12549 ft: 15340 corp: 30/2260b lim: 100 exec/s: 57 rss: 74Mb L: 66/99 MS: 1 ChangeBit- 00:08:00.982 [2024-11-30 15:45:08.807589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.982 [2024-11-30 15:45:08.807620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.807659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.982 [2024-11-30 15:45:08.807673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.982 #58 NEW cov: 12549 ft: 15348 corp: 31/2317b lim: 100 exec/s: 58 rss: 74Mb L: 57/99 MS: 1 CrossOver- 00:08:00.982 [2024-11-30 15:45:08.867877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.982 [2024-11-30 15:45:08.867902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.867955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 
nsid:0 00:08:00.982 [2024-11-30 15:45:08.867968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.868019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.982 [2024-11-30 15:45:08.868033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.868083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:00.982 [2024-11-30 15:45:08.868097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.982 #59 NEW cov: 12549 ft: 15354 corp: 32/2413b lim: 100 exec/s: 59 rss: 74Mb L: 96/99 MS: 1 ChangeByte- 00:08:00.982 [2024-11-30 15:45:08.908037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:00.982 [2024-11-30 15:45:08.908062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.908117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:00.982 [2024-11-30 15:45:08.908131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.908180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:00.982 [2024-11-30 15:45:08.908194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.908243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:00.982 [2024-11-30 15:45:08.908255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.982 [2024-11-30 15:45:08.908306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:00.982 [2024-11-30 15:45:08.908320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:01.240 #60 NEW cov: 12549 ft: 15405 corp: 33/2513b lim: 100 exec/s: 60 rss: 74Mb L: 100/100 MS: 1 InsertByte- 00:08:01.240 [2024-11-30 15:45:08.967927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:01.240 [2024-11-30 15:45:08.967952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.240 [2024-11-30 15:45:08.968001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:01.240 [2024-11-30 15:45:08.968015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.240 [2024-11-30 15:45:08.968065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:01.240 [2024-11-30 15:45:08.968095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
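Every completion above is printed by spdk_nvme_print_completion in the same template: the (00/0b) pair is the status code type and status code (0x00 = generic command status, 0x0b = INVALID NAMESPACE OR FORMAT, which is the expected verdict here since the fuzzed commands carry nsid:0), cdw0 is the command-specific result, sqhd is the submission queue head pointer, and p/m/dnr are the phase tag, more-status, and do-not-retry bits of the 16-bit status word in the completion queue entry. A minimal standalone sketch of that decoding, assuming the bit layout from the NVMe base specification rather than SPDK's own struct definitions:

    /* decode_status.c - hedged sketch of the NVMe completion status word.
     * Bit positions follow the NVMe base spec; this is illustrative,
     * not SPDK's spdk_nvme_status layout. */
    #include <stdint.h>
    #include <stdio.h>

    static void print_status(uint16_t status)
    {
        unsigned p   = status & 0x1;          /* bit 0: phase tag */
        unsigned sc  = (status >> 1) & 0xff;  /* bits 8:1: status code */
        unsigned sct = (status >> 9) & 0x7;   /* bits 11:9: status code type */
        unsigned m   = (status >> 14) & 0x1;  /* bit 14: more status available */
        unsigned dnr = (status >> 15) & 0x1;  /* bit 15: do not retry */
        printf("(%02x/%02x) p:%u m:%u dnr:%u\n", sct, sc, p, m, dnr);
    }

    int main(void)
    {
        /* Reconstructed from the completions above: SCT 0x00, SC 0x0b, DNR set. */
        print_status((uint16_t)((1u << 15) | (0x0b << 1)));
        return 0;
    }

This prints "(00/0b) p:0 m:0 dnr:1", matching the log lines; with dnr set the target is telling the initiator not to retry, so the fuzzer just records the completion and moves on to the next mutated input.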
00:08:01.240 [2024-11-30 15:45:08.968147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:01.240 [2024-11-30 15:45:08.968162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.240 #61 NEW cov: 12549 ft: 15416 corp: 34/2602b lim: 100 exec/s: 61 rss: 74Mb L: 89/100 MS: 1 ShuffleBytes- 00:08:01.240 [2024-11-30 15:45:09.007904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:01.240 [2024-11-30 15:45:09.007929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.240 [2024-11-30 15:45:09.007979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:01.240 [2024-11-30 15:45:09.007993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.240 [2024-11-30 15:45:09.008042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:01.240 [2024-11-30 15:45:09.008056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.240 [2024-11-30 15:45:09.008106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:01.240 [2024-11-30 15:45:09.008119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.240 #62 NEW cov: 12549 ft: 15465 corp: 35/2683b lim: 100 exec/s: 62 rss: 74Mb L: 81/100 MS: 1 ChangeByte- 00:08:01.240 [2024-11-30 15:45:09.047696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:01.240 [2024-11-30 15:45:09.047722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.240 [2024-11-30 15:45:09.047765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:01.240 [2024-11-30 15:45:09.047779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.240 #63 NEW cov: 12549 ft: 15480 corp: 36/2740b lim: 100 exec/s: 31 rss: 74Mb L: 57/100 MS: 1 ShuffleBytes- 00:08:01.240 #63 DONE cov: 12549 ft: 15480 corp: 36/2740b lim: 100 exec/s: 31 rss: 74Mb 00:08:01.240 ###### Recommended dictionary. ###### 00:08:01.240 "\000\224\306\3114\312 \350" # Uses: 0 00:08:01.240 "\000\224\306\311s2\037\246" # Uses: 0 00:08:01.240 ###### End of recommended dictionary. 
######
00:08:01.240 Done 63 runs in 2 second(s)
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419'
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:01.240 15:45:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19
[2024-11-30 15:45:09.207096] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization...
[2024-11-30 15:45:09.207164] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1719733 ]
[2024-11-30 15:45:09.451202] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation.
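The xtrace lines above show the shape of every run in this loop: nvmf/run.sh derives the listen port from the fuzzer number ("44" followed by printf %02d, hence 4418 and 4419), rewrites trsvcid in fuzz_json.conf with sed, registers two LSAN leak suppressions, and launches llvm_nvme_fuzz with -Z selecting which NVM command gets fuzzed; the NEW_FUNC lines that follow name that handler (fuzz_nvm_write_uncorrectable_command for -Z 19) together with the shared TestOneInput entry point. A schematic libFuzzer harness in that shape, assuming an illustrative handler signature and byte-to-field mapping rather than SPDK's real ones:

    /* harness_sketch.c - schematic of the llvm_nvme_fuzz entry point.
     * TestOneInput and the fuzz_nvm_*_command handlers live in SPDK's
     * test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c; everything below
     * is a stand-in for illustration only.
     * Build: clang -g -fsanitize=fuzzer harness_sketch.c */
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Hypothetical handler: carve an LBA and a block count out of the
     * fuzz bytes; the real harness fills an NVMe command with them and
     * submits it to the TCP target listening on the port above. */
    static void fuzz_one_nvm_command(const uint8_t *data, size_t size)
    {
        uint64_t slba = 0;
        uint16_t nlb = 0;
        if (size >= sizeof(slba)) {
            memcpy(&slba, data, sizeof(slba));
        }
        if (size >= sizeof(slba) + sizeof(nlb)) {
            memcpy(&nlb, data + sizeof(slba), sizeof(nlb));
        }
        (void)slba;
        (void)nlb; /* real code: build the command and wait for completion */
    }

    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        fuzz_one_nvm_command(data, size); /* handler chosen by -Z */
        return 0;
    }

libFuzzer calls this entry point once per mutated input, which is why each NEW line below is bracketed by one burst of command/completion prints.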
00:08:01.757 [2024-11-30 15:45:09.499547] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.757 [2024-11-30 15:45:09.513056] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.757 [2024-11-30 15:45:09.565740] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:01.757 [2024-11-30 15:45:09.582065] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:01.757 INFO: Running with entropic power schedule (0xFF, 100). 00:08:01.757 INFO: Seed: 558351754 00:08:01.757 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:01.757 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:01.757 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:01.757 INFO: A corpus is not provided, starting from an empty corpus 00:08:01.757 #2 INITED exec/s: 0 rss: 64Mb 00:08:01.757 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:01.757 This may also happen if the target rejected all inputs we tried so far 00:08:01.757 [2024-11-30 15:45:09.637281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3689348814741910323 len:13108 00:08:01.757 [2024-11-30 15:45:09.637319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.757 [2024-11-30 15:45:09.637387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3689348814741910323 len:13108 00:08:01.757 [2024-11-30 15:45:09.637409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.015 NEW_FUNC[1/716]: 0x47f478 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:02.015 NEW_FUNC[2/716]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:02.015 #13 NEW cov: 12298 ft: 12297 corp: 2/24b lim: 50 exec/s: 0 rss: 71Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:02.015 [2024-11-30 15:45:09.967451] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:02.015 [2024-11-30 15:45:09.967520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.273 #15 NEW cov: 12413 ft: 13408 corp: 3/41b lim: 50 exec/s: 0 rss: 72Mb L: 17/23 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:02.273 [2024-11-30 15:45:10.017681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:02.273 [2024-11-30 15:45:10.017714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.273 [2024-11-30 15:45:10.017757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:02.273 [2024-11-30 15:45:10.017777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.273 [2024-11-30 15:45:10.017833] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:02.274 [2024-11-30 15:45:10.017848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.274 [2024-11-30 15:45:10.017902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:02.274 [2024-11-30 15:45:10.017918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.274 [2024-11-30 15:45:10.017976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:02.274 [2024-11-30 15:45:10.017992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:02.274 #17 NEW cov: 12419 ft: 13929 corp: 4/91b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:02.274 [2024-11-30 15:45:10.057427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6655295901103053916 len:23645 00:08:02.274 [2024-11-30 15:45:10.057458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.274 [2024-11-30 15:45:10.057491] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6655295901103053916 len:23645 00:08:02.274 [2024-11-30 15:45:10.057507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.274 [2024-11-30 15:45:10.057564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6655295901103053916 len:23645 00:08:02.274 [2024-11-30 15:45:10.057580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.274 #19 NEW cov: 12504 ft: 14395 corp: 5/122b lim: 50 exec/s: 0 rss: 72Mb L: 31/50 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:02.274 [2024-11-30 15:45:10.097384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3060207161816855347 len:13108 00:08:02.274 [2024-11-30 15:45:10.097415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.274 [2024-11-30 15:45:10.097463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3689348814741910323 len:13108 00:08:02.274 [2024-11-30 15:45:10.097480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.274 #24 NEW cov: 12504 ft: 14537 corp: 6/150b lim: 50 exec/s: 0 rss: 72Mb L: 28/50 MS: 5 InsertByte-InsertByte-CrossOver-CopyPart-CrossOver- 00:08:02.274 [2024-11-30 15:45:10.137236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:02.274 [2024-11-30 15:45:10.137265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.274 #25 NEW cov: 12504 ft: 14619 
corp: 7/166b lim: 50 exec/s: 0 rss: 72Mb L: 16/50 MS: 1 EraseBytes- 00:08:02.274 [2024-11-30 15:45:10.197765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:02.274 [2024-11-30 15:45:10.197793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.274 [2024-11-30 15:45:10.197847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:02.274 [2024-11-30 15:45:10.197866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.274 [2024-11-30 15:45:10.197920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:10240 00:08:02.274 [2024-11-30 15:45:10.197936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.274 [2024-11-30 15:45:10.197991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:02.274 [2024-11-30 15:45:10.198008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.274 [2024-11-30 15:45:10.198063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:02.274 [2024-11-30 15:45:10.198080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:02.531 #26 NEW cov: 12504 ft: 14689 corp: 8/216b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 ChangeByte- 00:08:02.531 [2024-11-30 15:45:10.257295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18408475567394390015 len:65536 00:08:02.531 [2024-11-30 15:45:10.257323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.531 #27 NEW cov: 12504 ft: 14771 corp: 9/233b lim: 50 exec/s: 0 rss: 72Mb L: 17/50 MS: 1 CrossOver- 00:08:02.531 [2024-11-30 15:45:10.297557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6655295901103053916 len:23645 00:08:02.531 [2024-11-30 15:45:10.297585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.531 [2024-11-30 15:45:10.297638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6655295901103053916 len:2816 00:08:02.531 [2024-11-30 15:45:10.297655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.531 [2024-11-30 15:45:10.297710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6701356245527298047 len:65536 00:08:02.531 [2024-11-30 15:45:10.297726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.532 #28 NEW cov: 12504 ft: 14844 corp: 10/272b lim: 50 exec/s: 0 rss: 72Mb L: 39/50 MS: 1 CrossOver- 
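The lba/len pairs printed in this run map directly onto the WRITE UNCORRECTABLE submission-queue entry: the starting LBA spans CDW10/CDW11 and the 0-based number of logical blocks sits in CDW12, so a printed len of 23645 corresponds to NLB 23644 (0x5C5C), and lba 6655295901103053916 is 0x5C5C5C5C5C5C5C5C, the signature of an InsertRepeatedBytes mutation. A small sketch of that packing (opcode 0x04 and the dword layout are from the NVMe base spec; the struct and helper names are illustrative, not SPDK's headers):

    /* wuc_sketch.c - illustrative WRITE UNCORRECTABLE field packing. */
    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    #define NVM_OPC_WRITE_UNCORRECTABLE 0x04u

    struct sqe_dwords {
        uint32_t cdw10; /* SLBA bits 31:0 */
        uint32_t cdw11; /* SLBA bits 63:32 */
        uint32_t cdw12; /* bits 15:0 = NLB, 0-based */
    };

    static struct sqe_dwords pack_write_uncorrectable(uint64_t slba, uint16_t nlb)
    {
        struct sqe_dwords dw = {
            .cdw10 = (uint32_t)(slba & 0xffffffffu),
            .cdw11 = (uint32_t)(slba >> 32),
            .cdw12 = nlb,
        };
        return dw;
    }

    int main(void)
    {
        /* Values from one command print above: len 23645 -> NLB 23644. */
        struct sqe_dwords dw = pack_write_uncorrectable(6655295901103053916ULL, 23644);
        printf("opc:%02x cdw10:%08" PRIx32 " cdw11:%08" PRIx32 " nlb:%" PRIu32 "\n",
               NVM_OPC_WRITE_UNCORRECTABLE, dw.cdw10, dw.cdw11, dw.cdw12 & 0xffffu);
        return 0;
    }

Since every command is sent with nsid:0, the target rejects each one the same way, which is why the completions in this run are uniformly INVALID NAMESPACE OR FORMAT (00/0b).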
00:08:02.532 [2024-11-30 15:45:10.357833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:02.532 [2024-11-30 15:45:10.357860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.532 [2024-11-30 15:45:10.357913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:02.532 [2024-11-30 15:45:10.357929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.532 [2024-11-30 15:45:10.357982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:10240 00:08:02.532 [2024-11-30 15:45:10.357997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.532 [2024-11-30 15:45:10.358051] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:02.532 [2024-11-30 15:45:10.358066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.532 [2024-11-30 15:45:10.358122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:02.532 [2024-11-30 15:45:10.358141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:02.532 #29 NEW cov: 12504 ft: 14915 corp: 11/322b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 CopyPart- 00:08:02.532 [2024-11-30 15:45:10.417899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:02.532 [2024-11-30 15:45:10.417927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.532 [2024-11-30 15:45:10.417996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:02.532 [2024-11-30 15:45:10.418012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.532 [2024-11-30 15:45:10.418069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:02.532 [2024-11-30 15:45:10.418084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.532 [2024-11-30 15:45:10.418140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551614 len:65536 00:08:02.532 [2024-11-30 15:45:10.418156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.532 [2024-11-30 15:45:10.418210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:02.532 [2024-11-30 15:45:10.418227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:02.532 #35 NEW cov: 12504 ft: 14972 corp: 12/372b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 ChangeBit- 00:08:02.532 [2024-11-30 15:45:10.457497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6655295901103053916 len:23645 00:08:02.532 [2024-11-30 15:45:10.457525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.532 [2024-11-30 15:45:10.457574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6655295901103053916 len:23645 00:08:02.532 [2024-11-30 15:45:10.457592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.532 #36 NEW cov: 12504 ft: 15084 corp: 13/398b lim: 50 exec/s: 0 rss: 72Mb L: 26/50 MS: 1 EraseBytes- 00:08:02.790 [2024-11-30 15:45:10.497384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18408475567394390015 len:23552 00:08:02.790 [2024-11-30 15:45:10.497414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.790 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:02.790 #37 NEW cov: 12527 ft: 15141 corp: 14/415b lim: 50 exec/s: 0 rss: 72Mb L: 17/50 MS: 1 ChangeByte- 00:08:02.790 [2024-11-30 15:45:10.557905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:02.790 [2024-11-30 15:45:10.557932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.790 [2024-11-30 15:45:10.557988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:02.790 [2024-11-30 15:45:10.558003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.790 [2024-11-30 15:45:10.558072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:02.790 [2024-11-30 15:45:10.558092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.790 [2024-11-30 15:45:10.558146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:02.790 [2024-11-30 15:45:10.558163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.790 [2024-11-30 15:45:10.558216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:02.790 [2024-11-30 15:45:10.558233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:02.790 #38 NEW cov: 12527 ft: 15153 corp: 15/465b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 CopyPart- 00:08:02.790 [2024-11-30 15:45:10.597544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:02.790 [2024-11-30 15:45:10.597572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.790 [2024-11-30 15:45:10.597628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65291 00:08:02.790 [2024-11-30 15:45:10.597645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.790 #39 NEW cov: 12527 ft: 15185 corp: 16/486b lim: 50 exec/s: 0 rss: 72Mb L: 21/50 MS: 1 CrossOver- 00:08:02.790 [2024-11-30 15:45:10.637909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:02.790 [2024-11-30 15:45:10.637936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.790 [2024-11-30 15:45:10.638007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18377782704415440895 len:65536 00:08:02.790 [2024-11-30 15:45:10.638024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.790 [2024-11-30 15:45:10.638081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:02.791 [2024-11-30 15:45:10.638097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.791 [2024-11-30 15:45:10.638150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551614 len:65536 00:08:02.791 [2024-11-30 15:45:10.638164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.791 [2024-11-30 15:45:10.638220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:02.791 [2024-11-30 15:45:10.638236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:02.791 #40 NEW cov: 12527 ft: 15212 corp: 17/536b lim: 50 exec/s: 40 rss: 72Mb L: 50/50 MS: 1 CopyPart- 00:08:02.791 [2024-11-30 15:45:10.697506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:02.791 [2024-11-30 15:45:10.697534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.791 #41 NEW cov: 12527 ft: 15244 corp: 18/553b lim: 50 exec/s: 41 rss: 73Mb L: 17/50 MS: 1 CrossOver- 00:08:03.049 [2024-11-30 15:45:10.757778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3060207161816855347 len:13108 00:08:03.049 [2024-11-30 15:45:10.757809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.049 [2024-11-30 15:45:10.757845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3689348814741910323 
len:13108 00:08:03.049 [2024-11-30 15:45:10.757859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.049 [2024-11-30 15:45:10.757918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446518897164156927 len:13108 00:08:03.049 [2024-11-30 15:45:10.757934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.049 #42 NEW cov: 12527 ft: 15250 corp: 19/587b lim: 50 exec/s: 42 rss: 73Mb L: 34/50 MS: 1 CrossOver- 00:08:03.049 [2024-11-30 15:45:10.817672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3689348814741910323 len:13108 00:08:03.049 [2024-11-30 15:45:10.817699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.049 [2024-11-30 15:45:10.817751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3689348814741910323 len:13108 00:08:03.049 [2024-11-30 15:45:10.817768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.049 #43 NEW cov: 12527 ft: 15313 corp: 20/611b lim: 50 exec/s: 43 rss: 73Mb L: 24/50 MS: 1 CopyPart- 00:08:03.049 [2024-11-30 15:45:10.878046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:03.049 [2024-11-30 15:45:10.878075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.049 [2024-11-30 15:45:10.878146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:03.049 [2024-11-30 15:45:10.878162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.049 [2024-11-30 15:45:10.878216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:03.049 [2024-11-30 15:45:10.878232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.049 [2024-11-30 15:45:10.878286] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709486079 len:65536 00:08:03.049 [2024-11-30 15:45:10.878303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.049 [2024-11-30 15:45:10.878359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:03.049 [2024-11-30 15:45:10.878375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:03.049 #44 NEW cov: 12527 ft: 15383 corp: 21/661b lim: 50 exec/s: 44 rss: 73Mb L: 50/50 MS: 1 CopyPart- 00:08:03.049 [2024-11-30 15:45:10.917573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:03.049 [2024-11-30 15:45:10.917604] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.049 #45 NEW cov: 12527 ft: 15390 corp: 22/675b lim: 50 exec/s: 45 rss: 73Mb L: 14/50 MS: 1 EraseBytes- 00:08:03.049 [2024-11-30 15:45:10.957698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:11309 00:08:03.049 [2024-11-30 15:45:10.957727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.049 [2024-11-30 15:45:10.957801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3182967604875373612 len:11309 00:08:03.049 [2024-11-30 15:45:10.957818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.049 #47 NEW cov: 12527 ft: 15393 corp: 23/704b lim: 50 exec/s: 47 rss: 73Mb L: 29/50 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:03.307 [2024-11-30 15:45:11.017635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:03.307 [2024-11-30 15:45:11.017665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.307 #49 NEW cov: 12527 ft: 15401 corp: 24/719b lim: 50 exec/s: 49 rss: 73Mb L: 15/50 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:03.307 [2024-11-30 15:45:11.077785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:03.307 [2024-11-30 15:45:11.077815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.307 [2024-11-30 15:45:11.077865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446742978492891135 len:1 00:08:03.308 [2024-11-30 15:45:11.077882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.308 #50 NEW cov: 12527 ft: 15427 corp: 25/743b lim: 50 exec/s: 50 rss: 73Mb L: 24/50 MS: 1 InsertRepeatedBytes- 00:08:03.308 [2024-11-30 15:45:11.117734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14757395256400792780 len:53044 00:08:03.308 [2024-11-30 15:45:11.117761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.308 [2024-11-30 15:45:11.117800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3689348814741910323 len:13108 00:08:03.308 [2024-11-30 15:45:11.117817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.308 #51 NEW cov: 12527 ft: 15450 corp: 26/766b lim: 50 exec/s: 51 rss: 73Mb L: 23/50 MS: 1 ChangeBinInt- 00:08:03.308 [2024-11-30 15:45:11.157688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:03.308 [2024-11-30 15:45:11.157718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.308 #52 NEW 
cov: 12527 ft: 15461 corp: 27/783b lim: 50 exec/s: 52 rss: 73Mb L: 17/50 MS: 1 ShuffleBytes- 00:08:03.308 [2024-11-30 15:45:11.198038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:03.308 [2024-11-30 15:45:11.198067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.308 [2024-11-30 15:45:11.198119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:03.308 [2024-11-30 15:45:11.198134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.308 [2024-11-30 15:45:11.198188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551359 len:65536 00:08:03.308 [2024-11-30 15:45:11.198205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.308 [2024-11-30 15:45:11.198258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:03.308 [2024-11-30 15:45:11.198274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.308 #53 NEW cov: 12527 ft: 15525 corp: 28/824b lim: 50 exec/s: 53 rss: 73Mb L: 41/50 MS: 1 EraseBytes- 00:08:03.308 [2024-11-30 15:45:11.258123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6655295901103053916 len:23645 00:08:03.308 [2024-11-30 15:45:11.258152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.308 [2024-11-30 15:45:11.258199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6655295901103053916 len:23645 00:08:03.308 [2024-11-30 15:45:11.258215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.308 [2024-11-30 15:45:11.258269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6655295901103053916 len:23645 00:08:03.308 [2024-11-30 15:45:11.258285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.308 [2024-11-30 15:45:11.258340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:6655295901103053916 len:23645 00:08:03.308 [2024-11-30 15:45:11.258356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.565 #54 NEW cov: 12527 ft: 15620 corp: 29/872b lim: 50 exec/s: 54 rss: 73Mb L: 48/50 MS: 1 CopyPart- 00:08:03.566 [2024-11-30 15:45:11.317899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6655295901103053916 len:23645 00:08:03.566 [2024-11-30 15:45:11.317928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.566 [2024-11-30 15:45:11.317983] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6655295901103053916 len:23645 00:08:03.566 [2024-11-30 15:45:11.317999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.566 #55 NEW cov: 12527 ft: 15625 corp: 30/896b lim: 50 exec/s: 55 rss: 73Mb L: 24/50 MS: 1 EraseBytes- 00:08:03.566 [2024-11-30 15:45:11.358023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6655295901103053916 len:23645 00:08:03.566 [2024-11-30 15:45:11.358051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.566 [2024-11-30 15:45:11.358092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:6655295901103053916 len:2816 00:08:03.566 [2024-11-30 15:45:11.358108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.566 [2024-11-30 15:45:11.358166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6701356236937363455 len:65536 00:08:03.566 [2024-11-30 15:45:11.358181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.566 #56 NEW cov: 12527 ft: 15653 corp: 31/935b lim: 50 exec/s: 56 rss: 73Mb L: 39/50 MS: 1 ChangeBinInt- 00:08:03.566 [2024-11-30 15:45:11.417919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:03.566 [2024-11-30 15:45:11.417947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.566 [2024-11-30 15:45:11.417997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446742978492891135 len:1 00:08:03.566 [2024-11-30 15:45:11.418014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.566 #57 NEW cov: 12527 ft: 15662 corp: 32/959b lim: 50 exec/s: 57 rss: 73Mb L: 24/50 MS: 1 CopyPart- 00:08:03.566 [2024-11-30 15:45:11.478353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:03.566 [2024-11-30 15:45:11.478381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.566 [2024-11-30 15:45:11.478438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:03.566 [2024-11-30 15:45:11.478451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.566 [2024-11-30 15:45:11.478507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:10240 00:08:03.566 [2024-11-30 15:45:11.478522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.566 [2024-11-30 15:45:11.478579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE 
sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:03.566 [2024-11-30 15:45:11.478594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.566 [2024-11-30 15:45:11.478657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:03.566 [2024-11-30 15:45:11.478672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:03.566 #58 NEW cov: 12527 ft: 15672 corp: 33/1009b lim: 50 exec/s: 58 rss: 74Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:03.826 [2024-11-30 15:45:11.537947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18377634270329634815 len:23552 00:08:03.826 [2024-11-30 15:45:11.537978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.826 #59 NEW cov: 12527 ft: 15690 corp: 34/1026b lim: 50 exec/s: 59 rss: 74Mb L: 17/50 MS: 1 ShuffleBytes- 00:08:03.826 [2024-11-30 15:45:11.578085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65320 00:08:03.826 [2024-11-30 15:45:11.578114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.826 [2024-11-30 15:45:11.578172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:03.826 [2024-11-30 15:45:11.578187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.826 [2024-11-30 15:45:11.578245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:03.826 [2024-11-30 15:45:11.578261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.826 #60 NEW cov: 12527 ft: 15694 corp: 35/1057b lim: 50 exec/s: 60 rss: 74Mb L: 31/50 MS: 1 EraseBytes- 00:08:03.826 [2024-11-30 15:45:11.618332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:08:03.826 [2024-11-30 15:45:11.618361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.826 [2024-11-30 15:45:11.618407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:03.826 [2024-11-30 15:45:11.618423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.826 [2024-11-30 15:45:11.618476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709543423 len:65536 00:08:03.826 [2024-11-30 15:45:11.618514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.826 [2024-11-30 15:45:11.618570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 
len:65536 00:08:03.826 [2024-11-30 15:45:11.618585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.826 [2024-11-30 15:45:11.618646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:03.826 [2024-11-30 15:45:11.618662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:03.826 #61 NEW cov: 12527 ft: 15709 corp: 36/1107b lim: 50 exec/s: 30 rss: 74Mb L: 50/50 MS: 1 ChangeBit- 00:08:03.826 #61 DONE cov: 12527 ft: 15709 corp: 36/1107b lim: 50 exec/s: 30 rss: 74Mb 00:08:03.826 Done 61 runs in 2 second(s) 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:03.826 15:45:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:03.826 [2024-11-30 15:45:11.785617] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:08:03.826 [2024-11-30 15:45:11.785689] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1720273 ] 00:08:04.395 [2024-11-30 15:45:12.101074] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:04.395 [2024-11-30 15:45:12.147733] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.395 [2024-11-30 15:45:12.170855] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.395 [2024-11-30 15:45:12.223093] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:04.395 [2024-11-30 15:45:12.239412] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:04.395 INFO: Running with entropic power schedule (0xFF, 100). 00:08:04.395 INFO: Seed: 3213338355 00:08:04.395 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:04.395 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:04.395 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:04.395 INFO: A corpus is not provided, starting from an empty corpus 00:08:04.395 #2 INITED exec/s: 0 rss: 64Mb 00:08:04.396 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:04.396 This may also happen if the target rejected all inputs we tried so far 00:08:04.396 [2024-11-30 15:45:12.287972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.396 [2024-11-30 15:45:12.288000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.655 NEW_FUNC[1/718]: 0x481038 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:04.655 NEW_FUNC[2/718]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:04.655 #3 NEW cov: 12357 ft: 12328 corp: 2/21b lim: 90 exec/s: 0 rss: 72Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:08:04.655 [2024-11-30 15:45:12.608340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.655 [2024-11-30 15:45:12.608376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.655 [2024-11-30 15:45:12.608434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.655 [2024-11-30 15:45:12.608452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.655 [2024-11-30 15:45:12.608507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:04.655 [2024-11-30 15:45:12.608523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.915 #16 NEW cov: 12471 ft: 13654 corp: 3/90b lim: 90 exec/s: 0 rss: 72Mb L: 69/69 MS: 3 InsertByte-EraseBytes-InsertRepeatedBytes- 00:08:04.915 [2024-11-30 
15:45:12.647968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.915 [2024-11-30 15:45:12.647995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.915 #17 NEW cov: 12477 ft: 13844 corp: 4/110b lim: 90 exec/s: 0 rss: 72Mb L: 20/69 MS: 1 CMP- DE: "\377\377\377\013"- 00:08:04.915 [2024-11-30 15:45:12.708450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.915 [2024-11-30 15:45:12.708478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.915 [2024-11-30 15:45:12.708527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.915 [2024-11-30 15:45:12.708543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.915 [2024-11-30 15:45:12.708603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:04.915 [2024-11-30 15:45:12.708620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.915 [2024-11-30 15:45:12.708676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:04.915 [2024-11-30 15:45:12.708693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.915 #21 NEW cov: 12562 ft: 14369 corp: 5/198b lim: 90 exec/s: 0 rss: 72Mb L: 88/88 MS: 4 ShuffleBytes-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:04.915 [2024-11-30 15:45:12.748004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.915 [2024-11-30 15:45:12.748033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.915 #22 NEW cov: 12562 ft: 14536 corp: 6/218b lim: 90 exec/s: 0 rss: 72Mb L: 20/88 MS: 1 PersAutoDict- DE: "\377\377\377\013"- 00:08:04.915 [2024-11-30 15:45:12.788185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.915 [2024-11-30 15:45:12.788215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.915 [2024-11-30 15:45:12.788267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:04.915 [2024-11-30 15:45:12.788283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.915 #23 NEW cov: 12562 ft: 14868 corp: 7/258b lim: 90 exec/s: 0 rss: 72Mb L: 40/88 MS: 1 CrossOver- 00:08:04.915 [2024-11-30 15:45:12.828039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:04.915 [2024-11-30 15:45:12.828068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.915 #24 NEW cov: 12562 ft: 15074 corp: 8/278b lim: 90 exec/s: 0 rss: 72Mb L: 20/88 MS: 1 ChangeBit- 00:08:05.175 [2024-11-30 15:45:12.888075] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.175 [2024-11-30 15:45:12.888103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.175 #25 NEW cov: 12562 ft: 15180 corp: 9/298b lim: 90 exec/s: 0 rss: 72Mb L: 20/88 MS: 1 CopyPart- 00:08:05.175 [2024-11-30 15:45:12.928095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.175 [2024-11-30 15:45:12.928123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.175 #26 NEW cov: 12562 ft: 15190 corp: 10/319b lim: 90 exec/s: 0 rss: 72Mb L: 21/88 MS: 1 InsertByte- 00:08:05.175 [2024-11-30 15:45:12.968610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.175 [2024-11-30 15:45:12.968638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.175 [2024-11-30 15:45:12.968687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:05.175 [2024-11-30 15:45:12.968704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.175 [2024-11-30 15:45:12.968760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:05.175 [2024-11-30 15:45:12.968777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.175 [2024-11-30 15:45:12.968830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:05.175 [2024-11-30 15:45:12.968846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.175 #27 NEW cov: 12562 ft: 15308 corp: 11/404b lim: 90 exec/s: 0 rss: 72Mb L: 85/88 MS: 1 InsertRepeatedBytes- 00:08:05.175 [2024-11-30 15:45:13.008316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.175 [2024-11-30 15:45:13.008344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.175 [2024-11-30 15:45:13.008408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:05.175 [2024-11-30 15:45:13.008425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.175 #28 NEW cov: 12562 ft: 15376 corp: 12/453b lim: 90 exec/s: 0 rss: 72Mb L: 49/88 MS: 1 CrossOver- 00:08:05.175 [2024-11-30 15:45:13.068192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.175 [2024-11-30 15:45:13.068219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.175 #29 NEW cov: 12562 ft: 15409 corp: 13/474b lim: 90 exec/s: 0 rss: 72Mb L: 21/88 MS: 1 ChangeBit- 00:08:05.175 [2024-11-30 15:45:13.128215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.175 
[2024-11-30 15:45:13.128243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.435 #30 NEW cov: 12562 ft: 15453 corp: 14/495b lim: 90 exec/s: 0 rss: 72Mb L: 21/88 MS: 1 PersAutoDict- DE: "\377\377\377\013"- 00:08:05.435 [2024-11-30 15:45:13.168241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.435 [2024-11-30 15:45:13.168269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.435 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:05.435 #31 NEW cov: 12585 ft: 15508 corp: 15/515b lim: 90 exec/s: 0 rss: 73Mb L: 20/88 MS: 1 CopyPart- 00:08:05.435 [2024-11-30 15:45:13.228286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.435 [2024-11-30 15:45:13.228314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.435 #32 NEW cov: 12585 ft: 15519 corp: 16/533b lim: 90 exec/s: 0 rss: 73Mb L: 18/88 MS: 1 EraseBytes- 00:08:05.435 [2024-11-30 15:45:13.268239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.435 [2024-11-30 15:45:13.268267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.435 #33 NEW cov: 12585 ft: 15550 corp: 17/554b lim: 90 exec/s: 33 rss: 73Mb L: 21/88 MS: 1 InsertByte- 00:08:05.435 [2024-11-30 15:45:13.308737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.435 [2024-11-30 15:45:13.308765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.435 [2024-11-30 15:45:13.308817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:05.435 [2024-11-30 15:45:13.308834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.435 [2024-11-30 15:45:13.308889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:05.435 [2024-11-30 15:45:13.308905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.435 [2024-11-30 15:45:13.308962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:05.435 [2024-11-30 15:45:13.308978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.435 #34 NEW cov: 12585 ft: 15584 corp: 18/637b lim: 90 exec/s: 34 rss: 73Mb L: 83/88 MS: 1 InsertRepeatedBytes- 00:08:05.435 [2024-11-30 15:45:13.368463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.435 [2024-11-30 15:45:13.368490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.435 [2024-11-30 15:45:13.368547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:05.435 [2024-11-30 15:45:13.368563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.695 [2024-11-30 15:45:13.428494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.695 [2024-11-30 15:45:13.428520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.695 [2024-11-30 15:45:13.428559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:05.695 [2024-11-30 15:45:13.428574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.695 #36 NEW cov: 12585 ft: 15653 corp: 19/686b lim: 90 exec/s: 36 rss: 73Mb L: 49/88 MS: 2 PersAutoDict-ChangeByte- DE: "\377\377\377\013"- 00:08:05.695 [2024-11-30 15:45:13.468340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.695 [2024-11-30 15:45:13.468368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.695 #37 NEW cov: 12585 ft: 15671 corp: 20/708b lim: 90 exec/s: 37 rss: 73Mb L: 22/88 MS: 1 CrossOver- 00:08:05.695 [2024-11-30 15:45:13.508348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.695 [2024-11-30 15:45:13.508375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.695 #38 NEW cov: 12585 ft: 15730 corp: 21/729b lim: 90 exec/s: 38 rss: 73Mb L: 21/88 MS: 1 CopyPart- 00:08:05.695 [2024-11-30 15:45:13.548397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.695 [2024-11-30 15:45:13.548424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.695 #39 NEW cov: 12585 ft: 15762 corp: 22/751b lim: 90 exec/s: 39 rss: 73Mb L: 22/88 MS: 1 InsertByte- 00:08:05.695 [2024-11-30 15:45:13.588407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.695 [2024-11-30 15:45:13.588436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.695 #40 NEW cov: 12585 ft: 15775 corp: 23/776b lim: 90 exec/s: 40 rss: 73Mb L: 25/88 MS: 1 PersAutoDict- DE: "\377\377\377\013"- 00:08:05.695 [2024-11-30 15:45:13.648418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.695 [2024-11-30 15:45:13.648445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.956 #41 NEW cov: 12585 ft: 15847 corp: 24/798b lim: 90 exec/s: 41 rss: 73Mb L: 22/88 MS: 1 ChangeBit- 00:08:05.956 [2024-11-30 15:45:13.709058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.956 [2024-11-30 15:45:13.709084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.956 [2024-11-30 
15:45:13.709140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:05.956 [2024-11-30 15:45:13.709155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.956 [2024-11-30 15:45:13.709224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:05.956 [2024-11-30 15:45:13.709240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.956 [2024-11-30 15:45:13.709295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:05.956 [2024-11-30 15:45:13.709314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.956 [2024-11-30 15:45:13.709369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:05.956 [2024-11-30 15:45:13.709384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:05.956 #42 NEW cov: 12585 ft: 15919 corp: 25/888b lim: 90 exec/s: 42 rss: 73Mb L: 90/90 MS: 1 CrossOver- 00:08:05.956 [2024-11-30 15:45:13.768755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.956 [2024-11-30 15:45:13.768781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.956 [2024-11-30 15:45:13.768827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:05.956 [2024-11-30 15:45:13.768845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.956 [2024-11-30 15:45:13.768899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:05.956 [2024-11-30 15:45:13.768913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.956 #43 NEW cov: 12585 ft: 15966 corp: 26/945b lim: 90 exec/s: 43 rss: 73Mb L: 57/90 MS: 1 InsertRepeatedBytes- 00:08:05.956 [2024-11-30 15:45:13.808460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.956 [2024-11-30 15:45:13.808487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.956 #44 NEW cov: 12585 ft: 15973 corp: 27/965b lim: 90 exec/s: 44 rss: 73Mb L: 20/90 MS: 1 PersAutoDict- DE: "\377\377\377\013"- 00:08:05.956 [2024-11-30 15:45:13.848530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.956 [2024-11-30 15:45:13.848558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.956 #45 NEW cov: 12585 ft: 16017 corp: 28/985b lim: 90 exec/s: 45 rss: 73Mb L: 20/90 MS: 1 ChangeBinInt- 00:08:05.956 [2024-11-30 15:45:13.908516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:05.956 [2024-11-30 15:45:13.908544] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.215 #46 NEW cov: 12585 ft: 16036 corp: 29/1010b lim: 90 exec/s: 46 rss: 73Mb L: 25/90 MS: 1 CMP- DE: "\377\377\000\000"- 00:08:06.215 [2024-11-30 15:45:13.948864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:06.215 [2024-11-30 15:45:13.948892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.215 [2024-11-30 15:45:13.948935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:06.215 [2024-11-30 15:45:13.948950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.215 [2024-11-30 15:45:13.949004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:06.215 [2024-11-30 15:45:13.949019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.215 #47 NEW cov: 12585 ft: 16075 corp: 30/1067b lim: 90 exec/s: 47 rss: 74Mb L: 57/90 MS: 1 CMP- DE: "\000\224\306\314\327B\265\020"- 00:08:06.215 [2024-11-30 15:45:14.008568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:06.215 [2024-11-30 15:45:14.008596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.215 #48 NEW cov: 12585 ft: 16137 corp: 31/1087b lim: 90 exec/s: 48 rss: 74Mb L: 20/90 MS: 1 ShuffleBytes- 00:08:06.215 [2024-11-30 15:45:14.068771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:06.215 [2024-11-30 15:45:14.068798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.215 [2024-11-30 15:45:14.068835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:06.216 [2024-11-30 15:45:14.068852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.216 #49 NEW cov: 12585 ft: 16147 corp: 32/1136b lim: 90 exec/s: 49 rss: 74Mb L: 49/90 MS: 1 ShuffleBytes- 00:08:06.216 [2024-11-30 15:45:14.108761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:06.216 [2024-11-30 15:45:14.108790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.216 [2024-11-30 15:45:14.108833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:06.216 [2024-11-30 15:45:14.108848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.216 #50 NEW cov: 12585 ft: 16162 corp: 33/1185b lim: 90 exec/s: 50 rss: 74Mb L: 49/90 MS: 1 ChangeBit- 00:08:06.216 [2024-11-30 15:45:14.168614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:06.216 [2024-11-30 15:45:14.168643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.475 #51 NEW cov: 12585 ft: 16168 corp: 34/1209b lim: 90 exec/s: 51 rss: 74Mb L: 24/90 MS: 1 CopyPart- 00:08:06.475 [2024-11-30 15:45:14.208614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:06.475 [2024-11-30 15:45:14.208641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.475 #52 NEW cov: 12585 ft: 16225 corp: 35/1234b lim: 90 exec/s: 52 rss: 74Mb L: 25/90 MS: 1 ChangeByte- 00:08:06.475 [2024-11-30 15:45:14.268973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:06.475 [2024-11-30 15:45:14.269001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.475 [2024-11-30 15:45:14.269045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:06.475 [2024-11-30 15:45:14.269061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.475 [2024-11-30 15:45:14.269118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:06.475 [2024-11-30 15:45:14.269151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.476 #53 NEW cov: 12585 ft: 16243 corp: 36/1303b lim: 90 exec/s: 26 rss: 74Mb L: 69/90 MS: 1 ChangeBit- 00:08:06.476 #53 DONE cov: 12585 ft: 16243 corp: 36/1303b lim: 90 exec/s: 26 rss: 74Mb 00:08:06.476 ###### Recommended dictionary. ###### 00:08:06.476 "\377\377\377\013" # Uses: 5 00:08:06.476 "\377\377\000\000" # Uses: 0 00:08:06.476 "\000\224\306\314\327B\265\020" # Uses: 0 00:08:06.476 ###### End of recommended dictionary. 
###### 00:08:06.476 Done 53 runs in 2 second(s) 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:06.476 15:45:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:06.821 [2024-11-30 15:45:14.458943] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:06.821 [2024-11-30 15:45:14.459016] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1720704 ] 00:08:07.173 [2024-11-30 15:45:14.774279] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:08:07.173 [2024-11-30 15:45:14.821552] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.173 [2024-11-30 15:45:14.842762] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.173 [2024-11-30 15:45:14.895000] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:07.173 [2024-11-30 15:45:14.911327] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:07.173 INFO: Running with entropic power schedule (0xFF, 100). 00:08:07.173 INFO: Seed: 1591373447 00:08:07.173 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:07.173 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:07.173 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:07.173 INFO: A corpus is not provided, starting from an empty corpus 00:08:07.173 #2 INITED exec/s: 0 rss: 64Mb 00:08:07.173 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:07.173 This may also happen if the target rejected all inputs we tried so far 00:08:07.173 [2024-11-30 15:45:14.956533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.174 [2024-11-30 15:45:14.956562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.174 [2024-11-30 15:45:14.956618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.174 [2024-11-30 15:45:14.956634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.433 NEW_FUNC[1/718]: 0x484268 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:07.433 NEW_FUNC[2/718]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:07.433 #5 NEW cov: 12333 ft: 12332 corp: 2/22b lim: 50 exec/s: 0 rss: 71Mb L: 21/21 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:07.433 [2024-11-30 15:45:15.266815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.433 [2024-11-30 15:45:15.266855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.433 [2024-11-30 15:45:15.266920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.433 [2024-11-30 15:45:15.266940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.433 [2024-11-30 15:45:15.267002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:07.433 [2024-11-30 15:45:15.267021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.433 #19 NEW cov: 12446 ft: 13150 corp: 3/53b lim: 50 exec/s: 0 rss: 72Mb L: 31/31 MS: 4 InsertByte-ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:07.433 [2024-11-30 15:45:15.306389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.433 
[2024-11-30 15:45:15.306417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.433 #20 NEW cov: 12452 ft: 14059 corp: 4/69b lim: 50 exec/s: 0 rss: 72Mb L: 16/31 MS: 1 EraseBytes- 00:08:07.433 [2024-11-30 15:45:15.366435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.433 [2024-11-30 15:45:15.366463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.433 #23 NEW cov: 12537 ft: 14345 corp: 5/83b lim: 50 exec/s: 0 rss: 72Mb L: 14/31 MS: 3 ChangeByte-ShuffleBytes-CrossOver- 00:08:07.692 [2024-11-30 15:45:15.406605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.692 [2024-11-30 15:45:15.406633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.692 [2024-11-30 15:45:15.406670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.692 [2024-11-30 15:45:15.406686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.692 #26 NEW cov: 12537 ft: 14545 corp: 6/107b lim: 50 exec/s: 0 rss: 72Mb L: 24/31 MS: 3 ChangeByte-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:07.692 [2024-11-30 15:45:15.446467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.692 [2024-11-30 15:45:15.446495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.692 #27 NEW cov: 12537 ft: 14657 corp: 7/121b lim: 50 exec/s: 0 rss: 72Mb L: 14/31 MS: 1 ChangeBinInt- 00:08:07.692 [2024-11-30 15:45:15.506664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.692 [2024-11-30 15:45:15.506693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.692 [2024-11-30 15:45:15.506731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.692 [2024-11-30 15:45:15.506746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.692 #28 NEW cov: 12537 ft: 14771 corp: 8/142b lim: 50 exec/s: 0 rss: 72Mb L: 21/31 MS: 1 EraseBytes- 00:08:07.692 [2024-11-30 15:45:15.566538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.692 [2024-11-30 15:45:15.566568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.692 #29 NEW cov: 12537 ft: 14780 corp: 9/161b lim: 50 exec/s: 0 rss: 72Mb L: 19/31 MS: 1 EraseBytes- 00:08:07.692 [2024-11-30 15:45:15.606818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.692 [2024-11-30 15:45:15.606846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.692 [2024-11-30 15:45:15.606881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.692 [2024-11-30 15:45:15.606898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.692 [2024-11-30 15:45:15.606954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:07.693 [2024-11-30 15:45:15.606967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.693 #30 NEW cov: 12537 ft: 14855 corp: 10/194b lim: 50 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:07.952 [2024-11-30 15:45:15.666603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.952 [2024-11-30 15:45:15.666632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.952 #31 NEW cov: 12537 ft: 14903 corp: 11/213b lim: 50 exec/s: 0 rss: 72Mb L: 19/33 MS: 1 ChangeByte- 00:08:07.952 [2024-11-30 15:45:15.706592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.952 [2024-11-30 15:45:15.706625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.953 #32 NEW cov: 12537 ft: 14915 corp: 12/229b lim: 50 exec/s: 0 rss: 72Mb L: 16/33 MS: 1 ChangeByte- 00:08:07.953 [2024-11-30 15:45:15.767087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.953 [2024-11-30 15:45:15.767115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.953 [2024-11-30 15:45:15.767160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.953 [2024-11-30 15:45:15.767175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.953 [2024-11-30 15:45:15.767228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:07.953 [2024-11-30 15:45:15.767243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.953 [2024-11-30 15:45:15.767299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:07.953 [2024-11-30 15:45:15.767313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.953 #33 NEW cov: 12537 ft: 15254 corp: 13/269b lim: 50 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:07.953 [2024-11-30 15:45:15.806923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.953 [2024-11-30 15:45:15.806951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.953 [2024-11-30 15:45:15.806992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.953 [2024-11-30 15:45:15.807007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:07.953 [2024-11-30 15:45:15.807063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:07.953 [2024-11-30 15:45:15.807082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.953 #34 NEW cov: 12537 ft: 15267 corp: 14/304b lim: 50 exec/s: 0 rss: 72Mb L: 35/40 MS: 1 CopyPart- 00:08:07.953 [2024-11-30 15:45:15.847091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.953 [2024-11-30 15:45:15.847119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.953 [2024-11-30 15:45:15.847169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:07.953 [2024-11-30 15:45:15.847184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.953 [2024-11-30 15:45:15.847240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:07.953 [2024-11-30 15:45:15.847257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.953 [2024-11-30 15:45:15.847310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:07.953 [2024-11-30 15:45:15.847326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.953 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:07.953 #35 NEW cov: 12560 ft: 15309 corp: 15/344b lim: 50 exec/s: 0 rss: 73Mb L: 40/40 MS: 1 CopyPart- 00:08:07.953 [2024-11-30 15:45:15.906655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:07.953 [2024-11-30 15:45:15.906682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.213 #36 NEW cov: 12560 ft: 15359 corp: 16/363b lim: 50 exec/s: 0 rss: 73Mb L: 19/40 MS: 1 ChangeBinInt- 00:08:08.213 [2024-11-30 15:45:15.947141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.213 [2024-11-30 15:45:15.947170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.213 [2024-11-30 15:45:15.947226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.213 [2024-11-30 15:45:15.947242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.213 [2024-11-30 15:45:15.947298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:08.213 [2024-11-30 15:45:15.947314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.213 [2024-11-30 15:45:15.947369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:08.213 [2024-11-30 
15:45:15.947385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.213 #37 NEW cov: 12560 ft: 15441 corp: 17/403b lim: 50 exec/s: 37 rss: 73Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:08.213 [2024-11-30 15:45:16.007004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.213 [2024-11-30 15:45:16.007033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.213 [2024-11-30 15:45:16.007070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.213 [2024-11-30 15:45:16.007087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.213 [2024-11-30 15:45:16.007143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:08.213 [2024-11-30 15:45:16.007159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.213 #38 NEW cov: 12560 ft: 15458 corp: 18/438b lim: 50 exec/s: 38 rss: 73Mb L: 35/40 MS: 1 ChangeByte- 00:08:08.213 [2024-11-30 15:45:16.067074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.213 [2024-11-30 15:45:16.067102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.213 [2024-11-30 15:45:16.067148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.213 [2024-11-30 15:45:16.067163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.213 [2024-11-30 15:45:16.067217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:08.213 [2024-11-30 15:45:16.067234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.213 #39 NEW cov: 12560 ft: 15490 corp: 19/473b lim: 50 exec/s: 39 rss: 73Mb L: 35/40 MS: 1 ShuffleBytes- 00:08:08.213 [2024-11-30 15:45:16.107205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.213 [2024-11-30 15:45:16.107234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.213 [2024-11-30 15:45:16.107275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.213 [2024-11-30 15:45:16.107291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.213 [2024-11-30 15:45:16.107346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:08.213 [2024-11-30 15:45:16.107361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.213 [2024-11-30 15:45:16.107433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:08.213 [2024-11-30 
15:45:16.107457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.213 #40 NEW cov: 12560 ft: 15500 corp: 20/515b lim: 50 exec/s: 40 rss: 73Mb L: 42/42 MS: 1 InsertRepeatedBytes- 00:08:08.213 [2024-11-30 15:45:16.166918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.213 [2024-11-30 15:45:16.166947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.213 [2024-11-30 15:45:16.166999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.213 [2024-11-30 15:45:16.167017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.473 #41 NEW cov: 12560 ft: 15533 corp: 21/539b lim: 50 exec/s: 41 rss: 73Mb L: 24/42 MS: 1 ChangeByte- 00:08:08.473 [2024-11-30 15:45:16.227114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.473 [2024-11-30 15:45:16.227142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.473 [2024-11-30 15:45:16.227187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.473 [2024-11-30 15:45:16.227203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.473 [2024-11-30 15:45:16.227256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:08.473 [2024-11-30 15:45:16.227271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.473 #42 NEW cov: 12560 ft: 15549 corp: 22/575b lim: 50 exec/s: 42 rss: 73Mb L: 36/42 MS: 1 CopyPart- 00:08:08.473 [2024-11-30 15:45:16.266947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.473 [2024-11-30 15:45:16.266974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.473 [2024-11-30 15:45:16.267025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.473 [2024-11-30 15:45:16.267039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.473 #43 NEW cov: 12560 ft: 15600 corp: 23/599b lim: 50 exec/s: 43 rss: 73Mb L: 24/42 MS: 1 ChangeBinInt- 00:08:08.473 [2024-11-30 15:45:16.306798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.473 [2024-11-30 15:45:16.306824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.473 #44 NEW cov: 12560 ft: 15623 corp: 24/618b lim: 50 exec/s: 44 rss: 73Mb L: 19/42 MS: 1 ChangeBinInt- 00:08:08.473 [2024-11-30 15:45:16.346813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.473 [2024-11-30 15:45:16.346857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.473 #45 NEW cov: 12560 ft: 15648 corp: 25/637b lim: 50 exec/s: 45 rss: 73Mb L: 19/42 MS: 1 CMP- DE: "\001\000\000e"- 00:08:08.473 [2024-11-30 15:45:16.407018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.473 [2024-11-30 15:45:16.407045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.473 [2024-11-30 15:45:16.407110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.473 [2024-11-30 15:45:16.407126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.473 #46 NEW cov: 12560 ft: 15704 corp: 26/657b lim: 50 exec/s: 46 rss: 73Mb L: 20/42 MS: 1 EraseBytes- 00:08:08.732 [2024-11-30 15:45:16.446871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.732 [2024-11-30 15:45:16.446898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.732 #47 NEW cov: 12560 ft: 15735 corp: 27/676b lim: 50 exec/s: 47 rss: 73Mb L: 19/42 MS: 1 ChangeBinInt- 00:08:08.732 [2024-11-30 15:45:16.487015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.732 [2024-11-30 15:45:16.487044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.732 [2024-11-30 15:45:16.487095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.732 [2024-11-30 15:45:16.487111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.732 #48 NEW cov: 12560 ft: 15748 corp: 28/696b lim: 50 exec/s: 48 rss: 73Mb L: 20/42 MS: 1 InsertByte- 00:08:08.732 [2024-11-30 15:45:16.546939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.732 [2024-11-30 15:45:16.546967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.732 #49 NEW cov: 12560 ft: 15764 corp: 29/715b lim: 50 exec/s: 49 rss: 73Mb L: 19/42 MS: 1 PersAutoDict- DE: "\001\000\000e"- 00:08:08.732 [2024-11-30 15:45:16.607297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.732 [2024-11-30 15:45:16.607324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.732 [2024-11-30 15:45:16.607376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.732 [2024-11-30 15:45:16.607395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.732 [2024-11-30 15:45:16.607450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:08.732 [2024-11-30 15:45:16.607467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:08.732 #50 NEW cov: 12560 ft: 15771 corp: 30/751b lim: 50 exec/s: 50 rss: 73Mb L: 36/42 MS: 1 ChangeBit- 00:08:08.732 [2024-11-30 15:45:16.667181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.732 [2024-11-30 15:45:16.667209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.732 [2024-11-30 15:45:16.667247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.732 [2024-11-30 15:45:16.667263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.732 #51 NEW cov: 12560 ft: 15812 corp: 31/772b lim: 50 exec/s: 51 rss: 73Mb L: 21/42 MS: 1 EraseBytes- 00:08:08.992 [2024-11-30 15:45:16.707508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.992 [2024-11-30 15:45:16.707537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.992 [2024-11-30 15:45:16.707586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.992 [2024-11-30 15:45:16.707605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.992 [2024-11-30 15:45:16.707661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:08.992 [2024-11-30 15:45:16.707676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.992 [2024-11-30 15:45:16.707732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:08.992 [2024-11-30 15:45:16.707747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.992 #52 NEW cov: 12560 ft: 15834 corp: 32/812b lim: 50 exec/s: 52 rss: 74Mb L: 40/42 MS: 1 ChangeBinInt- 00:08:08.992 [2024-11-30 15:45:16.747237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.992 [2024-11-30 15:45:16.747265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.992 [2024-11-30 15:45:16.747317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.992 [2024-11-30 15:45:16.747332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.992 #55 NEW cov: 12560 ft: 15862 corp: 33/840b lim: 50 exec/s: 55 rss: 74Mb L: 28/42 MS: 3 ChangeBit-CMP-CrossOver- DE: "\013\000\000\000\000\000\000\000"- 00:08:08.992 [2024-11-30 15:45:16.787384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.992 [2024-11-30 15:45:16.787411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.992 [2024-11-30 15:45:16.787447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.992 
[2024-11-30 15:45:16.787464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.992 [2024-11-30 15:45:16.787520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:08.992 [2024-11-30 15:45:16.787538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.992 #56 NEW cov: 12560 ft: 15864 corp: 34/875b lim: 50 exec/s: 56 rss: 74Mb L: 35/42 MS: 1 ShuffleBytes- 00:08:08.992 [2024-11-30 15:45:16.847373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.992 [2024-11-30 15:45:16.847401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.992 [2024-11-30 15:45:16.847455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:08.992 [2024-11-30 15:45:16.847469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.992 [2024-11-30 15:45:16.847527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:08.992 [2024-11-30 15:45:16.847544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.992 #57 NEW cov: 12560 ft: 15872 corp: 35/908b lim: 50 exec/s: 57 rss: 74Mb L: 33/42 MS: 1 ShuffleBytes- 00:08:08.992 [2024-11-30 15:45:16.907118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:08.992 [2024-11-30 15:45:16.907147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.992 #58 NEW cov: 12560 ft: 15878 corp: 36/927b lim: 50 exec/s: 29 rss: 74Mb L: 19/42 MS: 1 ShuffleBytes- 00:08:08.992 #58 DONE cov: 12560 ft: 15878 corp: 36/927b lim: 50 exec/s: 29 rss: 74Mb 00:08:08.992 ###### Recommended dictionary. ###### 00:08:08.992 "\001\000\000e" # Uses: 1 00:08:08.992 "\013\000\000\000\000\000\000\000" # Uses: 0 00:08:08.992 ###### End of recommended dictionary. 
###### 00:08:08.992 Done 58 runs in 2 second(s) 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:09.253 15:45:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:09.253 [2024-11-30 15:45:17.098281] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:09.253 [2024-11-30 15:45:17.098374] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1721103 ] 00:08:09.513 [2024-11-30 15:45:17.416210] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
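Almost every notice in these runs is a command/completion pair from nvme_qpair.c, and the completion is always summarized as "(00/0b) ... p:0 m:0 dnr:1": status code type 0x00 (generic command status), status code 0x0b (named Invalid Namespace or Format in the NVMe spec), with the Do Not Retry bit set. In other words, the target is rejecting the fuzzed reservation commands, which is the expected outcome for this harness. As a worked example of where those fields sit in the 16-bit status halfword of a completion entry (CQE dword 3, bits 16-31), here is a small shell decoder; the helper name is illustrative, and the 0x8016 sample value is reconstructed from the printed fields rather than read out of the log.

  # Field layout of the NVMe completion status halfword:
  #   bit 0      phase tag (p)
  #   bits 1-8   status code (sc)
  #   bits 9-11  status code type (sct)
  #   bits 12-13 command retry delay (crd, not printed by SPDK's notice)
  #   bit 14     more (m)
  #   bit 15     do not retry (dnr)
  decode_cqe_status() {
      local status=$1
      printf '(%02x/%02x) p:%d m:%d dnr:%d\n' \
          $(( (status >> 9) & 0x7 )) \
          $(( (status >> 1) & 0xff )) \
          $(( status & 1 )) \
          $(( (status >> 14) & 1 )) \
          $(( (status >> 15) & 1 ))
  }
  decode_cqe_status 0x8016    # prints "(00/0b) p:0 m:0 dnr:1"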
00:08:09.513 [2024-11-30 15:45:17.463090] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.772 [2024-11-30 15:45:17.481048] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.772 [2024-11-30 15:45:17.533446] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:09.772 [2024-11-30 15:45:17.549789] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:09.772 INFO: Running with entropic power schedule (0xFF, 100). 00:08:09.772 INFO: Seed: 4229379999 00:08:09.772 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:09.772 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:09.772 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:09.772 INFO: A corpus is not provided, starting from an empty corpus 00:08:09.772 #2 INITED exec/s: 0 rss: 64Mb 00:08:09.772 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:09.772 This may also happen if the target rejected all inputs we tried so far 00:08:09.772 [2024-11-30 15:45:17.616271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:09.772 [2024-11-30 15:45:17.616314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.031 NEW_FUNC[1/718]: 0x486538 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:10.031 NEW_FUNC[2/718]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:10.031 #3 NEW cov: 12358 ft: 12329 corp: 2/31b lim: 85 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:10.031 [2024-11-30 15:45:17.955710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.031 [2024-11-30 15:45:17.955764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.291 #4 NEW cov: 12472 ft: 12960 corp: 3/61b lim: 85 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 ChangeByte- 00:08:10.291 [2024-11-30 15:45:18.025569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.291 [2024-11-30 15:45:18.025604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.291 #10 NEW cov: 12478 ft: 13188 corp: 4/90b lim: 85 exec/s: 0 rss: 72Mb L: 29/30 MS: 1 EraseBytes- 00:08:10.291 [2024-11-30 15:45:18.095611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.291 [2024-11-30 15:45:18.095641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.291 #11 NEW cov: 12563 ft: 13577 corp: 5/119b lim: 85 exec/s: 0 rss: 72Mb L: 29/30 MS: 1 ShuffleBytes- 00:08:10.291 [2024-11-30 15:45:18.165629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.291 [2024-11-30 15:45:18.165657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:10.291 #12 NEW cov: 12563 ft: 13645 corp: 6/136b lim: 85 exec/s: 0 rss: 72Mb L: 17/30 MS: 1 EraseBytes- 00:08:10.291 [2024-11-30 15:45:18.236443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.291 [2024-11-30 15:45:18.236478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.291 [2024-11-30 15:45:18.236566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:10.291 [2024-11-30 15:45:18.236587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.291 [2024-11-30 15:45:18.236723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:10.291 [2024-11-30 15:45:18.236747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.291 [2024-11-30 15:45:18.236875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:10.291 [2024-11-30 15:45:18.236894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.550 #21 NEW cov: 12563 ft: 14567 corp: 7/215b lim: 85 exec/s: 0 rss: 72Mb L: 79/79 MS: 4 ChangeByte-InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:10.550 [2024-11-30 15:45:18.296460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.550 [2024-11-30 15:45:18.296491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.550 [2024-11-30 15:45:18.296603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:10.550 [2024-11-30 15:45:18.296624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.550 [2024-11-30 15:45:18.296756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:10.550 [2024-11-30 15:45:18.296780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.550 [2024-11-30 15:45:18.296899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:10.550 [2024-11-30 15:45:18.296923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.550 #22 NEW cov: 12563 ft: 14679 corp: 8/294b lim: 85 exec/s: 0 rss: 72Mb L: 79/79 MS: 1 ShuffleBytes- 00:08:10.550 [2024-11-30 15:45:18.365756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.550 [2024-11-30 15:45:18.365781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.550 #23 NEW cov: 12563 ft: 14723 corp: 9/323b lim: 85 exec/s: 0 rss: 73Mb L: 29/79 MS: 1 ChangeBit- 00:08:10.550 [2024-11-30 15:45:18.415937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.550 [2024-11-30 
15:45:18.415963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.550 #24 NEW cov: 12563 ft: 14789 corp: 10/340b lim: 85 exec/s: 0 rss: 73Mb L: 17/79 MS: 1 ChangeByte- 00:08:10.550 [2024-11-30 15:45:18.486105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.550 [2024-11-30 15:45:18.486138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.550 [2024-11-30 15:45:18.486259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:10.550 [2024-11-30 15:45:18.486282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.809 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:10.809 #25 NEW cov: 12586 ft: 15252 corp: 11/374b lim: 85 exec/s: 0 rss: 73Mb L: 34/79 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:10.809 [2024-11-30 15:45:18.545956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.809 [2024-11-30 15:45:18.545988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.809 #26 NEW cov: 12586 ft: 15335 corp: 12/404b lim: 85 exec/s: 0 rss: 73Mb L: 30/79 MS: 1 InsertByte- 00:08:10.809 [2024-11-30 15:45:18.596757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.809 [2024-11-30 15:45:18.596789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.809 [2024-11-30 15:45:18.596856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:10.809 [2024-11-30 15:45:18.596877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.809 [2024-11-30 15:45:18.596999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:10.809 [2024-11-30 15:45:18.597024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.809 [2024-11-30 15:45:18.597151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:10.809 [2024-11-30 15:45:18.597174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.809 #27 NEW cov: 12586 ft: 15376 corp: 13/483b lim: 85 exec/s: 27 rss: 73Mb L: 79/79 MS: 1 CopyPart- 00:08:10.809 [2024-11-30 15:45:18.645997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.809 [2024-11-30 15:45:18.646028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.809 #28 NEW cov: 12586 ft: 15404 corp: 14/501b lim: 85 exec/s: 28 rss: 73Mb L: 18/79 MS: 1 InsertByte- 00:08:10.809 [2024-11-30 15:45:18.716874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) 
sqid:1 cid:0 nsid:0 00:08:10.809 [2024-11-30 15:45:18.716907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.809 [2024-11-30 15:45:18.717014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:10.809 [2024-11-30 15:45:18.717034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.809 [2024-11-30 15:45:18.717164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:10.809 [2024-11-30 15:45:18.717183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.809 [2024-11-30 15:45:18.717309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:10.810 [2024-11-30 15:45:18.717329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.810 #29 NEW cov: 12586 ft: 15416 corp: 15/582b lim: 85 exec/s: 29 rss: 73Mb L: 81/81 MS: 1 CopyPart- 00:08:10.810 [2024-11-30 15:45:18.766126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:10.810 [2024-11-30 15:45:18.766153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.069 #30 NEW cov: 12586 ft: 15479 corp: 16/612b lim: 85 exec/s: 30 rss: 73Mb L: 30/81 MS: 1 ShuffleBytes- 00:08:11.069 [2024-11-30 15:45:18.816127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.069 [2024-11-30 15:45:18.816154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.069 #31 NEW cov: 12586 ft: 15534 corp: 17/630b lim: 85 exec/s: 31 rss: 73Mb L: 18/81 MS: 1 ChangeByte- 00:08:11.069 [2024-11-30 15:45:18.886542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.069 [2024-11-30 15:45:18.886571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.069 [2024-11-30 15:45:18.886712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:11.069 [2024-11-30 15:45:18.886733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.069 #32 NEW cov: 12586 ft: 15574 corp: 18/667b lim: 85 exec/s: 32 rss: 73Mb L: 37/81 MS: 1 InsertRepeatedBytes- 00:08:11.069 [2024-11-30 15:45:18.956583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.069 [2024-11-30 15:45:18.956620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.069 [2024-11-30 15:45:18.956735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:11.069 [2024-11-30 15:45:18.956761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:11.069 #33 NEW cov: 12586 ft: 15586 corp: 19/711b lim: 85 exec/s: 33 rss: 73Mb L: 44/81 MS: 1 CrossOver- 00:08:11.069 [2024-11-30 15:45:19.006288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.069 [2024-11-30 15:45:19.006321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.329 #34 NEW cov: 12586 ft: 15598 corp: 20/730b lim: 85 exec/s: 34 rss: 73Mb L: 19/81 MS: 1 CrossOver- 00:08:11.329 [2024-11-30 15:45:19.076385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.329 [2024-11-30 15:45:19.076417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.329 #35 NEW cov: 12586 ft: 15625 corp: 21/755b lim: 85 exec/s: 35 rss: 73Mb L: 25/81 MS: 1 EraseBytes- 00:08:11.329 [2024-11-30 15:45:19.127392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.329 [2024-11-30 15:45:19.127426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.329 [2024-11-30 15:45:19.127523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:11.329 [2024-11-30 15:45:19.127548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.329 [2024-11-30 15:45:19.127678] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:11.329 [2024-11-30 15:45:19.127706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.329 [2024-11-30 15:45:19.127828] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:11.329 [2024-11-30 15:45:19.127852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.329 [2024-11-30 15:45:19.127973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:11.329 [2024-11-30 15:45:19.127996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:11.329 #36 NEW cov: 12586 ft: 15694 corp: 22/840b lim: 85 exec/s: 36 rss: 73Mb L: 85/85 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:11.329 [2024-11-30 15:45:19.197239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.329 [2024-11-30 15:45:19.197275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.329 [2024-11-30 15:45:19.197370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:11.329 [2024-11-30 15:45:19.197395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.329 [2024-11-30 15:45:19.197512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:11.329 [2024-11-30 
15:45:19.197536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.329 [2024-11-30 15:45:19.197661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:11.329 [2024-11-30 15:45:19.197681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.329 #37 NEW cov: 12586 ft: 15729 corp: 23/910b lim: 85 exec/s: 37 rss: 73Mb L: 70/85 MS: 1 EraseBytes- 00:08:11.329 [2024-11-30 15:45:19.246415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.329 [2024-11-30 15:45:19.246444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.329 #38 NEW cov: 12586 ft: 15791 corp: 24/940b lim: 85 exec/s: 38 rss: 73Mb L: 30/85 MS: 1 ChangeByte- 00:08:11.590 [2024-11-30 15:45:19.297249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.590 [2024-11-30 15:45:19.297280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.590 [2024-11-30 15:45:19.297349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:11.590 [2024-11-30 15:45:19.297373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.590 [2024-11-30 15:45:19.297497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:11.590 [2024-11-30 15:45:19.297518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.590 [2024-11-30 15:45:19.297645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:11.590 [2024-11-30 15:45:19.297666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.590 #39 NEW cov: 12586 ft: 15817 corp: 25/1013b lim: 85 exec/s: 39 rss: 73Mb L: 73/85 MS: 1 InsertRepeatedBytes- 00:08:11.590 [2024-11-30 15:45:19.347271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.590 [2024-11-30 15:45:19.347302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.590 [2024-11-30 15:45:19.347363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:11.590 [2024-11-30 15:45:19.347386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.590 [2024-11-30 15:45:19.347504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:11.590 [2024-11-30 15:45:19.347529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.590 [2024-11-30 15:45:19.347656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:11.590 
[2024-11-30 15:45:19.347679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.590 #40 NEW cov: 12586 ft: 15826 corp: 26/1097b lim: 85 exec/s: 40 rss: 73Mb L: 84/85 MS: 1 CopyPart- 00:08:11.590 [2024-11-30 15:45:19.416588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.590 [2024-11-30 15:45:19.416628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.590 #41 NEW cov: 12586 ft: 15846 corp: 27/1127b lim: 85 exec/s: 41 rss: 73Mb L: 30/85 MS: 1 InsertByte- 00:08:11.590 [2024-11-30 15:45:19.466928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.590 [2024-11-30 15:45:19.466955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.590 #42 NEW cov: 12586 ft: 15857 corp: 28/1146b lim: 85 exec/s: 42 rss: 74Mb L: 19/85 MS: 1 EraseBytes- 00:08:11.590 [2024-11-30 15:45:19.537361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.590 [2024-11-30 15:45:19.537391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.590 [2024-11-30 15:45:19.537474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:11.590 [2024-11-30 15:45:19.537496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.590 [2024-11-30 15:45:19.537614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:11.590 [2024-11-30 15:45:19.537644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.590 [2024-11-30 15:45:19.537761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:11.590 [2024-11-30 15:45:19.537786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.850 #43 NEW cov: 12586 ft: 15867 corp: 29/1217b lim: 85 exec/s: 43 rss: 74Mb L: 71/85 MS: 1 CopyPart- 00:08:11.850 [2024-11-30 15:45:19.606670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:11.850 [2024-11-30 15:45:19.606704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.850 #44 NEW cov: 12586 ft: 15871 corp: 30/1246b lim: 85 exec/s: 22 rss: 74Mb L: 29/85 MS: 1 ChangeByte- 00:08:11.850 #44 DONE cov: 12586 ft: 15871 corp: 30/1246b lim: 85 exec/s: 22 rss: 74Mb 00:08:11.850 ###### Recommended dictionary. ###### 00:08:11.850 "\001\000\000\000" # Uses: 1 00:08:11.850 ###### End of recommended dictionary. 
###### 00:08:11.850 Done 44 runs in 2 second(s) 00:08:11.850 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:11.850 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:11.850 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:11.850 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:11.850 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:11.850 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:11.851 15:45:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:11.851 [2024-11-30 15:45:19.775462] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:11.851 [2024-11-30 15:45:19.775529] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1721637 ] 00:08:12.419 [2024-11-30 15:45:20.089638] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
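The xtrace above is, in effect, a recipe for launching one fuzzer instance: the TCP service port is derived from the fuzzer number ("44" plus the zero-padded index, so 23 gives 4423 and, further below, 24 gives 4424), a per-run JSON config is produced by rewriting trsvcid 4420, and two LSAN leak suppressions are installed before llvm_nvme_fuzz starts. A minimal standalone sketch of the same steps, assuming an SPDK checkout at $SPDK and using only the flags visible in the trace (the variable names and the explicit redirections into $cfg and $supp are assumptions; nvmf/run.sh handles those internally):

  # Launch fuzzer instance N the way nvmf/run.sh does (sketch; $SPDK is assumed).
  N=23
  port="44$(printf %02d "$N")"       # 23 -> 4423, 24 -> 4424
  corpus="$SPDK/../corpus/llvm_nvmf_$N"
  cfg="/tmp/fuzz_json_$N.conf"
  supp="/var/tmp/suppress_nvmf_fuzz"

  mkdir -p "$corpus"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # Point this run's JSON config at its own port (run.sh@38).
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$cfg"
  # LSAN leak suppressions echoed by run.sh@41-42.
  { echo leak:spdk_nvmf_qpair_disconnect; echo leak:nvmf_ctrlr_create; } > "$supp"

  LSAN_OPTIONS="report_objects=1:suppressions=$supp:print_suppressions=0" \
  "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m 0x1 -s 512 -P "$SPDK/../output/llvm/" -F "$trid" \
      -c "$cfg" -t 1 -D "$corpus" -Z "$N"
  # run.sh@54 removes $cfg and $supp once the run completes.

The same sequence repeats below for fuzzer 24 (port 4424, /tmp/fuzz_json_24.conf), which is why each "Done N runs in ... second(s)" summary is immediately followed by another port/config/suppression setup block.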
00:08:12.419 [2024-11-30 15:45:20.137389] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.419 [2024-11-30 15:45:20.157331] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.419 [2024-11-30 15:45:20.210102] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:12.419 [2024-11-30 15:45:20.226428] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:12.419 INFO: Running with entropic power schedule (0xFF, 100). 00:08:12.419 INFO: Seed: 2611433578 00:08:12.419 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:12.419 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:12.419 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:12.419 INFO: A corpus is not provided, starting from an empty corpus 00:08:12.419 #2 INITED exec/s: 0 rss: 64Mb 00:08:12.419 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:12.419 This may also happen if the target rejected all inputs we tried so far 00:08:12.419 [2024-11-30 15:45:20.271950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:12.419 [2024-11-30 15:45:20.271983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.419 [2024-11-30 15:45:20.272024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:12.419 [2024-11-30 15:45:20.272040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.419 [2024-11-30 15:45:20.272095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:12.419 [2024-11-30 15:45:20.272111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.419 [2024-11-30 15:45:20.272164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:12.419 [2024-11-30 15:45:20.272180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.419 [2024-11-30 15:45:20.272235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:12.419 [2024-11-30 15:45:20.272250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:12.679 NEW_FUNC[1/716]: 0x489778 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:12.679 NEW_FUNC[2/716]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:12.679 #10 NEW cov: 12289 ft: 12285 corp: 2/26b lim: 25 exec/s: 0 rss: 72Mb L: 25/25 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:12.679 [2024-11-30 15:45:20.612088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:12.679 [2024-11-30 15:45:20.612133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:12.679 [2024-11-30 15:45:20.612202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:12.679 [2024-11-30 15:45:20.612222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.679 [2024-11-30 15:45:20.612289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:12.679 [2024-11-30 15:45:20.612308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.679 [2024-11-30 15:45:20.612374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:12.679 [2024-11-30 15:45:20.612393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.679 NEW_FUNC[1/1]: 0x19e4078 in nvme_qpair_is_admin_queue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1190 00:08:12.679 #15 NEW cov: 12405 ft: 13034 corp: 3/50b lim: 25 exec/s: 0 rss: 72Mb L: 24/25 MS: 5 CrossOver-InsertByte-ShuffleBytes-ChangeByte-CrossOver- 00:08:12.946 [2024-11-30 15:45:20.651819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:12.946 [2024-11-30 15:45:20.651850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.946 [2024-11-30 15:45:20.651903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:12.946 [2024-11-30 15:45:20.651920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.946 [2024-11-30 15:45:20.651982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:12.946 [2024-11-30 15:45:20.651998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.947 #16 NEW cov: 12411 ft: 13741 corp: 4/69b lim: 25 exec/s: 0 rss: 72Mb L: 19/25 MS: 1 InsertRepeatedBytes- 00:08:12.947 [2024-11-30 15:45:20.691684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:12.947 [2024-11-30 15:45:20.691727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.947 [2024-11-30 15:45:20.691788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:12.947 [2024-11-30 15:45:20.691806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.947 #17 NEW cov: 12496 ft: 14251 corp: 5/83b lim: 25 exec/s: 0 rss: 72Mb L: 14/25 MS: 1 EraseBytes- 00:08:12.947 [2024-11-30 15:45:20.751989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:12.947 [2024-11-30 15:45:20.752018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.947 [2024-11-30 15:45:20.752070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 
nsid:0 00:08:12.947 [2024-11-30 15:45:20.752086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.947 [2024-11-30 15:45:20.752149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:12.947 [2024-11-30 15:45:20.752165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.947 [2024-11-30 15:45:20.752228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:12.947 [2024-11-30 15:45:20.752245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.947 #18 NEW cov: 12496 ft: 14353 corp: 6/107b lim: 25 exec/s: 0 rss: 73Mb L: 24/25 MS: 1 CopyPart- 00:08:12.947 [2024-11-30 15:45:20.811864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:12.947 [2024-11-30 15:45:20.811892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.947 [2024-11-30 15:45:20.811957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:12.947 [2024-11-30 15:45:20.811973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.947 [2024-11-30 15:45:20.812035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:12.947 [2024-11-30 15:45:20.812051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.947 #19 NEW cov: 12496 ft: 14453 corp: 7/124b lim: 25 exec/s: 0 rss: 73Mb L: 17/25 MS: 1 EraseBytes- 00:08:12.947 [2024-11-30 15:45:20.852030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:12.947 [2024-11-30 15:45:20.852058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.947 [2024-11-30 15:45:20.852111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:12.947 [2024-11-30 15:45:20.852127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.947 [2024-11-30 15:45:20.852186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:12.947 [2024-11-30 15:45:20.852201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.947 [2024-11-30 15:45:20.852263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:12.947 [2024-11-30 15:45:20.852280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.947 #20 NEW cov: 12496 ft: 14516 corp: 8/148b lim: 25 exec/s: 0 rss: 73Mb L: 24/25 MS: 1 ChangeBit- 00:08:13.205 [2024-11-30 15:45:20.911933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.205 
[2024-11-30 15:45:20.911962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.205 [2024-11-30 15:45:20.912014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.205 [2024-11-30 15:45:20.912030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.205 [2024-11-30 15:45:20.912092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.205 [2024-11-30 15:45:20.912108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.205 #21 NEW cov: 12496 ft: 14635 corp: 9/165b lim: 25 exec/s: 0 rss: 73Mb L: 17/25 MS: 1 CrossOver- 00:08:13.205 [2024-11-30 15:45:20.972112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.205 [2024-11-30 15:45:20.972143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.205 [2024-11-30 15:45:20.972215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.205 [2024-11-30 15:45:20.972232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.205 [2024-11-30 15:45:20.972290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.205 [2024-11-30 15:45:20.972306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.205 [2024-11-30 15:45:20.972365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:13.205 [2024-11-30 15:45:20.972382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.205 #23 NEW cov: 12496 ft: 14714 corp: 10/188b lim: 25 exec/s: 0 rss: 73Mb L: 23/25 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:13.205 [2024-11-30 15:45:21.031832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.205 [2024-11-30 15:45:21.031858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.205 [2024-11-30 15:45:21.031915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.205 [2024-11-30 15:45:21.031933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.205 #24 NEW cov: 12496 ft: 14772 corp: 11/202b lim: 25 exec/s: 0 rss: 73Mb L: 14/25 MS: 1 ChangeBit- 00:08:13.205 [2024-11-30 15:45:21.091857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.205 [2024-11-30 15:45:21.091884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.205 [2024-11-30 15:45:21.091952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.205 
[2024-11-30 15:45:21.091969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.205 #25 NEW cov: 12496 ft: 14805 corp: 12/216b lim: 25 exec/s: 0 rss: 73Mb L: 14/25 MS: 1 ShuffleBytes- 00:08:13.205 [2024-11-30 15:45:21.151982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.205 [2024-11-30 15:45:21.152010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.205 [2024-11-30 15:45:21.152063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.205 [2024-11-30 15:45:21.152080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.205 [2024-11-30 15:45:21.152138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.205 [2024-11-30 15:45:21.152155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.464 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:13.464 #26 NEW cov: 12519 ft: 14850 corp: 13/231b lim: 25 exec/s: 0 rss: 73Mb L: 15/25 MS: 1 InsertByte- 00:08:13.464 [2024-11-30 15:45:21.212056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.464 [2024-11-30 15:45:21.212083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.464 [2024-11-30 15:45:21.212146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.464 [2024-11-30 15:45:21.212162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.464 [2024-11-30 15:45:21.212226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.464 [2024-11-30 15:45:21.212242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.464 #27 NEW cov: 12519 ft: 14884 corp: 14/248b lim: 25 exec/s: 0 rss: 73Mb L: 17/25 MS: 1 CopyPart- 00:08:13.464 [2024-11-30 15:45:21.272158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.464 [2024-11-30 15:45:21.272185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.465 [2024-11-30 15:45:21.272259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.465 [2024-11-30 15:45:21.272275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.465 [2024-11-30 15:45:21.272335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.465 [2024-11-30 15:45:21.272352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.465 [2024-11-30 15:45:21.272411] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:13.465 [2024-11-30 15:45:21.272427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.465 #28 NEW cov: 12519 ft: 14957 corp: 15/272b lim: 25 exec/s: 28 rss: 73Mb L: 24/25 MS: 1 CMP- DE: "\377\005"- 00:08:13.465 [2024-11-30 15:45:21.312027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.465 [2024-11-30 15:45:21.312055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.465 [2024-11-30 15:45:21.312108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.465 [2024-11-30 15:45:21.312124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.465 [2024-11-30 15:45:21.312187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.465 [2024-11-30 15:45:21.312203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.465 #29 NEW cov: 12519 ft: 14968 corp: 16/289b lim: 25 exec/s: 29 rss: 73Mb L: 17/25 MS: 1 ChangeBinInt- 00:08:13.465 [2024-11-30 15:45:21.352215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.465 [2024-11-30 15:45:21.352242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.465 [2024-11-30 15:45:21.352315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.465 [2024-11-30 15:45:21.352332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.465 [2024-11-30 15:45:21.352391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.465 [2024-11-30 15:45:21.352408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.465 [2024-11-30 15:45:21.352467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:13.465 [2024-11-30 15:45:21.352482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.465 #32 NEW cov: 12519 ft: 15002 corp: 17/310b lim: 25 exec/s: 32 rss: 73Mb L: 21/25 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:08:13.465 [2024-11-30 15:45:21.392383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.465 [2024-11-30 15:45:21.392414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.465 [2024-11-30 15:45:21.392486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.465 [2024-11-30 15:45:21.392503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.465 
[2024-11-30 15:45:21.392561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.465 [2024-11-30 15:45:21.392577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.465 [2024-11-30 15:45:21.392634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:13.465 [2024-11-30 15:45:21.392650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.465 [2024-11-30 15:45:21.392710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:13.465 [2024-11-30 15:45:21.392725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:13.724 #33 NEW cov: 12519 ft: 15028 corp: 18/335b lim: 25 exec/s: 33 rss: 74Mb L: 25/25 MS: 1 CopyPart- 00:08:13.724 [2024-11-30 15:45:21.452409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.724 [2024-11-30 15:45:21.452436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.724 [2024-11-30 15:45:21.452513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.724 [2024-11-30 15:45:21.452530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.724 [2024-11-30 15:45:21.452589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.724 [2024-11-30 15:45:21.452610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.724 [2024-11-30 15:45:21.452670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:13.724 [2024-11-30 15:45:21.452686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.724 [2024-11-30 15:45:21.452747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:13.724 [2024-11-30 15:45:21.452763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:13.724 #34 NEW cov: 12519 ft: 15100 corp: 19/360b lim: 25 exec/s: 34 rss: 74Mb L: 25/25 MS: 1 ChangeBinInt- 00:08:13.724 [2024-11-30 15:45:21.492363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.724 [2024-11-30 15:45:21.492391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.724 [2024-11-30 15:45:21.492462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.724 [2024-11-30 15:45:21.492479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.724 [2024-11-30 15:45:21.492538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 
00:08:13.724 [2024-11-30 15:45:21.492554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.724 [2024-11-30 15:45:21.492616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:13.724 [2024-11-30 15:45:21.492635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.724 #35 NEW cov: 12519 ft: 15121 corp: 20/383b lim: 25 exec/s: 35 rss: 74Mb L: 23/25 MS: 1 CopyPart- 00:08:13.724 [2024-11-30 15:45:21.552311] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.724 [2024-11-30 15:45:21.552339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.724 [2024-11-30 15:45:21.552411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.724 [2024-11-30 15:45:21.552428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.725 [2024-11-30 15:45:21.552486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.725 [2024-11-30 15:45:21.552503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.725 [2024-11-30 15:45:21.552563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:13.725 [2024-11-30 15:45:21.552579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.725 #36 NEW cov: 12519 ft: 15139 corp: 21/407b lim: 25 exec/s: 36 rss: 74Mb L: 24/25 MS: 1 CrossOver- 00:08:13.725 [2024-11-30 15:45:21.612095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.725 [2024-11-30 15:45:21.612123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.725 [2024-11-30 15:45:21.612181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.725 [2024-11-30 15:45:21.612199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.725 #37 NEW cov: 12519 ft: 15165 corp: 22/421b lim: 25 exec/s: 37 rss: 74Mb L: 14/25 MS: 1 PersAutoDict- DE: "\377\005"- 00:08:13.725 [2024-11-30 15:45:21.652130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.725 [2024-11-30 15:45:21.652158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.725 [2024-11-30 15:45:21.652207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.725 [2024-11-30 15:45:21.652224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.725 #38 NEW cov: 12519 ft: 15174 corp: 23/435b lim: 25 exec/s: 38 rss: 74Mb L: 14/25 MS: 1 PersAutoDict- DE: "\377\005"- 00:08:13.984 
[2024-11-30 15:45:21.692575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.984 [2024-11-30 15:45:21.692607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.984 [2024-11-30 15:45:21.692670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.984 [2024-11-30 15:45:21.692685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.984 [2024-11-30 15:45:21.692747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.984 [2024-11-30 15:45:21.692762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.984 [2024-11-30 15:45:21.692824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:13.985 [2024-11-30 15:45:21.692839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.692903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:13.985 [2024-11-30 15:45:21.692919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:13.985 #39 NEW cov: 12519 ft: 15215 corp: 24/460b lim: 25 exec/s: 39 rss: 74Mb L: 25/25 MS: 1 ChangeBit- 00:08:13.985 [2024-11-30 15:45:21.752185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.985 [2024-11-30 15:45:21.752213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.752273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.985 [2024-11-30 15:45:21.752290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.985 #40 NEW cov: 12519 ft: 15270 corp: 25/474b lim: 25 exec/s: 40 rss: 74Mb L: 14/25 MS: 1 ChangeBit- 00:08:13.985 [2024-11-30 15:45:21.812603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.985 [2024-11-30 15:45:21.812632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.812694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.985 [2024-11-30 15:45:21.812709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.812771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.985 [2024-11-30 15:45:21.812788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.812846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 
00:08:13.985 [2024-11-30 15:45:21.812863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.812922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:13.985 [2024-11-30 15:45:21.812938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:13.985 #41 NEW cov: 12519 ft: 15341 corp: 26/499b lim: 25 exec/s: 41 rss: 74Mb L: 25/25 MS: 1 CrossOver- 00:08:13.985 [2024-11-30 15:45:21.872487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.985 [2024-11-30 15:45:21.872517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.872574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.985 [2024-11-30 15:45:21.872591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.872658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.985 [2024-11-30 15:45:21.872675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.872738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:13.985 [2024-11-30 15:45:21.872753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.985 #42 NEW cov: 12519 ft: 15342 corp: 27/522b lim: 25 exec/s: 42 rss: 74Mb L: 23/25 MS: 1 CopyPart- 00:08:13.985 [2024-11-30 15:45:21.912614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:13.985 [2024-11-30 15:45:21.912646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.912717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:13.985 [2024-11-30 15:45:21.912734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.912791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:13.985 [2024-11-30 15:45:21.912808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.912870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:13.985 [2024-11-30 15:45:21.912887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.985 [2024-11-30 15:45:21.912949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:13.985 [2024-11-30 15:45:21.912965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.245 #43 NEW cov: 12519 ft: 15357 corp: 28/547b lim: 25 exec/s: 43 rss: 74Mb L: 25/25 MS: 1 ChangeBinInt- 00:08:14.245 [2024-11-30 15:45:21.972538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:14.245 [2024-11-30 15:45:21.972567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:21.972625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:14.245 [2024-11-30 15:45:21.972642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:21.972702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:14.245 [2024-11-30 15:45:21.972718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:21.972776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:14.245 [2024-11-30 15:45:21.972790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.245 #44 NEW cov: 12519 ft: 15399 corp: 29/571b lim: 25 exec/s: 44 rss: 74Mb L: 24/25 MS: 1 ChangeByte- 00:08:14.245 [2024-11-30 15:45:22.032560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:14.245 [2024-11-30 15:45:22.032588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:22.032682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:14.245 [2024-11-30 15:45:22.032698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:22.032777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:14.245 [2024-11-30 15:45:22.032791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:22.032854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:14.245 [2024-11-30 15:45:22.032871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.245 #45 NEW cov: 12519 ft: 15402 corp: 30/594b lim: 25 exec/s: 45 rss: 74Mb L: 23/25 MS: 1 ShuffleBytes- 00:08:14.245 [2024-11-30 15:45:22.072413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:14.245 [2024-11-30 15:45:22.072445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:22.072484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:14.245 [2024-11-30 15:45:22.072499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:22.072561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:14.245 [2024-11-30 15:45:22.072578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.245 #46 NEW cov: 12519 ft: 15412 corp: 31/609b lim: 25 exec/s: 46 rss: 74Mb L: 15/25 MS: 1 InsertByte- 00:08:14.245 [2024-11-30 15:45:22.112710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:14.245 [2024-11-30 15:45:22.112738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:22.112797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:14.245 [2024-11-30 15:45:22.112812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:22.112870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:14.245 [2024-11-30 15:45:22.112886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:22.112946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:14.245 [2024-11-30 15:45:22.112962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.245 [2024-11-30 15:45:22.113022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:14.245 [2024-11-30 15:45:22.113038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.245 #47 NEW cov: 12519 ft: 15432 corp: 32/634b lim: 25 exec/s: 47 rss: 74Mb L: 25/25 MS: 1 CopyPart- 00:08:14.245 [2024-11-30 15:45:22.172249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:14.245 [2024-11-30 15:45:22.172277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.245 #50 NEW cov: 12519 ft: 15785 corp: 33/639b lim: 25 exec/s: 50 rss: 74Mb L: 5/25 MS: 3 ShuffleBytes-PersAutoDict-PersAutoDict- DE: "\377\005"-"\377\005"- 00:08:14.505 [2024-11-30 15:45:22.212539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:14.505 [2024-11-30 15:45:22.212568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.505 [2024-11-30 15:45:22.212627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:14.505 [2024-11-30 15:45:22.212643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.505 [2024-11-30 15:45:22.212705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:14.505 [2024-11-30 15:45:22.212721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.505 #51 NEW cov: 12519 ft: 15792 corp: 34/656b lim: 25 exec/s: 51 rss: 75Mb L: 17/25 MS: 1 ChangeByte- 00:08:14.505 [2024-11-30 15:45:22.272787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:14.505 [2024-11-30 15:45:22.272815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.505 [2024-11-30 15:45:22.272876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:14.505 [2024-11-30 15:45:22.272892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.505 [2024-11-30 15:45:22.272953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:14.505 [2024-11-30 15:45:22.272969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.505 [2024-11-30 15:45:22.273028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:14.505 [2024-11-30 15:45:22.273044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.505 [2024-11-30 15:45:22.273106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:14.505 [2024-11-30 15:45:22.273123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.505 #52 NEW cov: 12519 ft: 15802 corp: 35/681b lim: 25 exec/s: 26 rss: 75Mb L: 25/25 MS: 1 CopyPart- 00:08:14.505 #52 DONE cov: 12519 ft: 15802 corp: 35/681b lim: 25 exec/s: 26 rss: 75Mb 00:08:14.505 ###### Recommended dictionary. ###### 00:08:14.505 "\377\005" # Uses: 4 00:08:14.505 ###### End of recommended dictionary. 
###### 00:08:14.505 Done 52 runs in 2 second(s) 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:14.505 15:45:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:14.505 [2024-11-30 15:45:22.461779] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:14.505 [2024-11-30 15:45:22.461854] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1722172 ] 00:08:15.074 [2024-11-30 15:45:22.776882] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:08:15.074 [2024-11-30 15:45:22.822275] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.074 [2024-11-30 15:45:22.845408] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.074 [2024-11-30 15:45:22.897798] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.074 [2024-11-30 15:45:22.914116] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:15.074 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.074 INFO: Seed: 1003437978 00:08:15.074 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:15.074 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:15.074 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:15.074 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.074 #2 INITED exec/s: 0 rss: 64Mb 00:08:15.074 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:15.074 This may also happen if the target rejected all inputs we tried so far 00:08:15.074 [2024-11-30 15:45:22.963531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.075 [2024-11-30 15:45:22.963562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.075 [2024-11-30 15:45:22.963617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.075 [2024-11-30 15:45:22.963634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.075 [2024-11-30 15:45:22.963690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.075 [2024-11-30 15:45:22.963707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.075 [2024-11-30 15:45:22.963763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.075 [2024-11-30 15:45:22.963779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.334 NEW_FUNC[1/718]: 0x48a868 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:15.334 NEW_FUNC[2/718]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:15.335 #5 NEW cov: 12364 ft: 12362 corp: 2/85b lim: 100 exec/s: 0 rss: 72Mb L: 84/84 MS: 3 InsertByte-CrossOver-InsertRepeatedBytes- 00:08:15.335 [2024-11-30 15:45:23.283268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808861878429607 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.335 [2024-11-30 15:45:23.283302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.335 [2024-11-30 15:45:23.283364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.335 [2024-11-30 15:45:23.283382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.595 #9 NEW cov: 12477 ft: 13452 corp: 3/126b lim: 100 exec/s: 0 rss: 72Mb L: 41/84 MS: 4 InsertByte-ShuffleBytes-InsertRepeatedBytes-CrossOver- 00:08:15.595 [2024-11-30 15:45:23.323592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.323630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.323670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.323685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.323744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.323760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.323819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.323833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.595 #15 NEW cov: 12483 ft: 13702 corp: 4/211b lim: 100 exec/s: 0 rss: 72Mb L: 85/85 MS: 1 CopyPart- 00:08:15.595 [2024-11-30 15:45:23.383613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.383643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.383689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.383705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.383762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.383777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.383834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.383851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.595 #16 NEW cov: 12568 ft: 13976 corp: 5/295b lim: 100 exec/s: 0 rss: 72Mb L: 84/85 MS: 1 ShuffleBytes- 00:08:15.595 [2024-11-30 15:45:23.423638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 
nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.423667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.423721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.423738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.423794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.423810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.423866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.423882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.595 #17 NEW cov: 12568 ft: 14059 corp: 6/379b lim: 100 exec/s: 0 rss: 72Mb L: 84/85 MS: 1 ChangeByte- 00:08:15.595 [2024-11-30 15:45:23.483690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.483719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.483774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.483791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.483850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.483866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.483923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.483939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.595 #18 NEW cov: 12568 ft: 14136 corp: 7/463b lim: 100 exec/s: 0 rss: 72Mb L: 84/85 MS: 1 ChangeByte- 00:08:15.595 [2024-11-30 15:45:23.523659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.523688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.523742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.523758] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.523815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.523829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.595 [2024-11-30 15:45:23.523885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.595 [2024-11-30 15:45:23.523903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.854 #19 NEW cov: 12568 ft: 14259 corp: 8/547b lim: 100 exec/s: 0 rss: 72Mb L: 84/85 MS: 1 ChangeBit- 00:08:15.854 [2024-11-30 15:45:23.583694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.854 [2024-11-30 15:45:23.583722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.854 [2024-11-30 15:45:23.583781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.854 [2024-11-30 15:45:23.583797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.854 [2024-11-30 15:45:23.583871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.854 [2024-11-30 15:45:23.583887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.854 [2024-11-30 15:45:23.583946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18374403900871474942 len:65279 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.854 [2024-11-30 15:45:23.583963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.854 #20 NEW cov: 12568 ft: 14386 corp: 9/646b lim: 100 exec/s: 0 rss: 72Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:15.854 [2024-11-30 15:45:23.623699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.854 [2024-11-30 15:45:23.623728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.854 [2024-11-30 15:45:23.623780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.854 [2024-11-30 15:45:23.623797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.854 [2024-11-30 15:45:23.623855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7595718147998050665 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.854 [2024-11-30 15:45:23.623869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.854 [2024-11-30 15:45:23.623925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.854 [2024-11-30 15:45:23.623939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.854 #21 NEW cov: 12568 ft: 14443 corp: 10/744b lim: 100 exec/s: 0 rss: 72Mb L: 98/99 MS: 1 InsertRepeatedBytes- 00:08:15.854 [2024-11-30 15:45:23.663732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.854 [2024-11-30 15:45:23.663761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.855 [2024-11-30 15:45:23.663813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.855 [2024-11-30 15:45:23.663830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.855 [2024-11-30 15:45:23.663887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.855 [2024-11-30 15:45:23.663903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.855 [2024-11-30 15:45:23.663961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.855 [2024-11-30 15:45:23.663976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.855 #22 NEW cov: 12568 ft: 14468 corp: 11/829b lim: 100 exec/s: 0 rss: 72Mb L: 85/99 MS: 1 CrossOver- 00:08:15.855 [2024-11-30 15:45:23.723609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.855 [2024-11-30 15:45:23.723637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.855 [2024-11-30 15:45:23.723686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.855 [2024-11-30 15:45:23.723702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.855 [2024-11-30 15:45:23.723756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.855 [2024-11-30 15:45:23.723771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.855 #23 NEW cov: 12568 ft: 14758 corp: 12/892b lim: 100 exec/s: 0 rss: 72Mb L: 63/99 MS: 1 EraseBytes- 00:08:15.855 [2024-11-30 15:45:23.783806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.855 [2024-11-30 15:45:23.783834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:15.855 [2024-11-30 15:45:23.783890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.855 [2024-11-30 15:45:23.783906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.855 [2024-11-30 15:45:23.783963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.855 [2024-11-30 15:45:23.783979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.855 [2024-11-30 15:45:23.784036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:12033 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.855 [2024-11-30 15:45:23.784053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.855 #24 NEW cov: 12568 ft: 14821 corp: 13/977b lim: 100 exec/s: 0 rss: 72Mb L: 85/99 MS: 1 InsertByte- 00:08:16.114 [2024-11-30 15:45:23.823515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.823544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.114 [2024-11-30 15:45:23.823604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.823621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.114 #25 NEW cov: 12568 ft: 14849 corp: 14/1031b lim: 100 exec/s: 0 rss: 72Mb L: 54/99 MS: 1 InsertRepeatedBytes- 00:08:16.114 [2024-11-30 15:45:23.863883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772415 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.863912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.114 [2024-11-30 15:45:23.863983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.863999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.114 [2024-11-30 15:45:23.864059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7595718147998050665 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.864076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.114 [2024-11-30 15:45:23.864133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.864149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.114 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:16.114 #26 NEW cov: 12591 ft: 14889 corp: 15/1130b lim: 100 exec/s: 0 rss: 73Mb L: 99/99 MS: 1 InsertByte- 00:08:16.114 [2024-11-30 15:45:23.923533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:12080808866173396903 len:42920 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.923562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.114 [2024-11-30 15:45:23.923614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.923630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.114 #27 NEW cov: 12591 ft: 14922 corp: 16/1171b lim: 100 exec/s: 27 rss: 73Mb L: 41/99 MS: 1 ChangeBinInt- 00:08:16.114 [2024-11-30 15:45:23.983917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.983946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.114 [2024-11-30 15:45:23.984016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.984033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.114 [2024-11-30 15:45:23.984090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.984107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.114 [2024-11-30 15:45:23.984165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:23.984181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.114 #28 NEW cov: 12591 ft: 14951 corp: 17/1255b lim: 100 exec/s: 28 rss: 73Mb L: 84/99 MS: 1 ShuffleBytes- 00:08:16.114 [2024-11-30 15:45:24.023916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.114 [2024-11-30 15:45:24.023944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.114 [2024-11-30 15:45:24.024017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.115 [2024-11-30 15:45:24.024033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.115 [2024-11-30 15:45:24.024092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.115 [2024-11-30 15:45:24.024106] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.115 [2024-11-30 15:45:24.024166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.115 [2024-11-30 15:45:24.024183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.115 #29 NEW cov: 12591 ft: 14970 corp: 18/1341b lim: 100 exec/s: 29 rss: 73Mb L: 86/99 MS: 1 InsertByte- 00:08:16.374 [2024-11-30 15:45:24.083995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.084023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.084097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.084114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.084171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:201863462912 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.084190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.084248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.084265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.374 #30 NEW cov: 12591 ft: 15005 corp: 19/1425b lim: 100 exec/s: 30 rss: 73Mb L: 84/99 MS: 1 CopyPart- 00:08:16.374 [2024-11-30 15:45:24.143972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.144000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.144059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.144073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.144148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.144164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.144221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.144236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.374 #31 NEW cov: 12591 ft: 
15021 corp: 20/1510b lim: 100 exec/s: 31 rss: 73Mb L: 85/99 MS: 1 ChangeASCIIInt- 00:08:16.374 [2024-11-30 15:45:24.183961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.183988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.184046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.184062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.184120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:201863462912 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.184136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.184195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:14848 len:12033 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.184211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.374 #32 NEW cov: 12591 ft: 15086 corp: 21/1595b lim: 100 exec/s: 32 rss: 73Mb L: 85/99 MS: 1 InsertByte- 00:08:16.374 [2024-11-30 15:45:24.244021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1761607680 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.244050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.244121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.244137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.244198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.244214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.374 [2024-11-30 15:45:24.244273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.374 [2024-11-30 15:45:24.244290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.374 #33 NEW cov: 12591 ft: 15163 corp: 22/1680b lim: 100 exec/s: 33 rss: 73Mb L: 85/99 MS: 1 ChangeByte- 00:08:16.374 [2024-11-30 15:45:24.284037] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.375 [2024-11-30 15:45:24.284065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.375 
[2024-11-30 15:45:24.284136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570560 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.375 [2024-11-30 15:45:24.284153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.375 [2024-11-30 15:45:24.284211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.375 [2024-11-30 15:45:24.284228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.375 [2024-11-30 15:45:24.284286] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:12033 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.375 [2024-11-30 15:45:24.284302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.375 #34 NEW cov: 12591 ft: 15196 corp: 23/1765b lim: 100 exec/s: 34 rss: 73Mb L: 85/99 MS: 1 ChangeBinInt- 00:08:16.634 [2024-11-30 15:45:24.344096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.344124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.634 [2024-11-30 15:45:24.344181] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.344197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.634 [2024-11-30 15:45:24.344255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.344272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.634 [2024-11-30 15:45:24.344330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8 len:59 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.344346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.634 #35 NEW cov: 12591 ft: 15206 corp: 24/1859b lim: 100 exec/s: 35 rss: 73Mb L: 94/99 MS: 1 CopyPart- 00:08:16.634 [2024-11-30 15:45:24.404129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.404157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.634 [2024-11-30 15:45:24.404230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.404249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.634 [2024-11-30 15:45:24.404307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 
len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.404324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.634 [2024-11-30 15:45:24.404382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.404397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.634 #36 NEW cov: 12591 ft: 15227 corp: 25/1946b lim: 100 exec/s: 36 rss: 73Mb L: 87/99 MS: 1 CrossOver- 00:08:16.634 [2024-11-30 15:45:24.444263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.444290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.634 [2024-11-30 15:45:24.444365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.444381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.634 [2024-11-30 15:45:24.444438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7595718147998050665 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.444455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.634 [2024-11-30 15:45:24.444513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.444529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.634 [2024-11-30 15:45:24.444588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.444609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:16.634 #37 NEW cov: 12591 ft: 15285 corp: 26/2046b lim: 100 exec/s: 37 rss: 73Mb L: 100/100 MS: 1 CopyPart- 00:08:16.634 [2024-11-30 15:45:24.483979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.634 [2024-11-30 15:45:24.484007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.635 [2024-11-30 15:45:24.484073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.635 [2024-11-30 15:45:24.484089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.635 [2024-11-30 15:45:24.484149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:201863462912 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.635 [2024-11-30 15:45:24.484164] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.635 #38 NEW cov: 12591 ft: 15333 corp: 27/2120b lim: 100 exec/s: 38 rss: 73Mb L: 74/100 MS: 1 EraseBytes- 00:08:16.635 [2024-11-30 15:45:24.523833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.635 [2024-11-30 15:45:24.523863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.635 [2024-11-30 15:45:24.523936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.635 [2024-11-30 15:45:24.523954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.635 #39 NEW cov: 12591 ft: 15352 corp: 28/2174b lim: 100 exec/s: 39 rss: 73Mb L: 54/100 MS: 1 ChangeByte- 00:08:16.635 [2024-11-30 15:45:24.584188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.635 [2024-11-30 15:45:24.584216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.635 [2024-11-30 15:45:24.584289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.635 [2024-11-30 15:45:24.584306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.635 [2024-11-30 15:45:24.584363] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.635 [2024-11-30 15:45:24.584379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.635 [2024-11-30 15:45:24.584436] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18374403896610062078 len:65279 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.635 [2024-11-30 15:45:24.584453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.894 #40 NEW cov: 12591 ft: 15363 corp: 29/2273b lim: 100 exec/s: 40 rss: 73Mb L: 99/100 MS: 1 ShuffleBytes- 00:08:16.894 [2024-11-30 15:45:24.644241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.894 [2024-11-30 15:45:24.644269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.894 [2024-11-30 15:45:24.644337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.894 [2024-11-30 15:45:24.644354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.894 [2024-11-30 15:45:24.644410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.894 [2024-11-30 15:45:24.644427] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.894 [2024-11-30 15:45:24.644485] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18374403900871474942 len:65279 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.894 [2024-11-30 15:45:24.644502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.894 #41 NEW cov: 12591 ft: 15391 corp: 30/2372b lim: 100 exec/s: 41 rss: 73Mb L: 99/100 MS: 1 ChangeByte- 00:08:16.894 [2024-11-30 15:45:24.684251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1761607680 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.894 [2024-11-30 15:45:24.684280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.894 [2024-11-30 15:45:24.684352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.894 [2024-11-30 15:45:24.684369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.894 [2024-11-30 15:45:24.684428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.894 [2024-11-30 15:45:24.684444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.894 [2024-11-30 15:45:24.684503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:94 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.894 [2024-11-30 15:45:24.684518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.894 #42 NEW cov: 12591 ft: 15465 corp: 31/2458b lim: 100 exec/s: 42 rss: 74Mb L: 86/100 MS: 1 InsertByte- 00:08:16.894 [2024-11-30 15:45:24.744252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.894 [2024-11-30 15:45:24.744281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.894 [2024-11-30 15:45:24.744355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.894 [2024-11-30 15:45:24.744372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.894 [2024-11-30 15:45:24.744430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.895 [2024-11-30 15:45:24.744445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.895 [2024-11-30 15:45:24.744506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.895 [2024-11-30 15:45:24.744522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 
cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.895 #43 NEW cov: 12591 ft: 15512 corp: 32/2543b lim: 100 exec/s: 43 rss: 74Mb L: 85/100 MS: 1 CrossOver- 00:08:16.895 [2024-11-30 15:45:24.784302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.895 [2024-11-30 15:45:24.784331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.895 [2024-11-30 15:45:24.784402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.895 [2024-11-30 15:45:24.784419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.895 [2024-11-30 15:45:24.784477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.895 [2024-11-30 15:45:24.784493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.895 [2024-11-30 15:45:24.784553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:8 len:59 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.895 [2024-11-30 15:45:24.784568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.895 #44 NEW cov: 12591 ft: 15526 corp: 33/2637b lim: 100 exec/s: 44 rss: 74Mb L: 94/100 MS: 1 ChangeASCIIInt- 00:08:16.895 [2024-11-30 15:45:24.844343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.895 [2024-11-30 15:45:24.844372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.895 [2024-11-30 15:45:24.844427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18444773753167544319 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.895 [2024-11-30 15:45:24.844447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.895 [2024-11-30 15:45:24.844503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7595718147998050665 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.895 [2024-11-30 15:45:24.844519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.895 [2024-11-30 15:45:24.844577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.895 [2024-11-30 15:45:24.844591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.154 #45 NEW cov: 12591 ft: 15533 corp: 34/2735b lim: 100 exec/s: 45 rss: 74Mb L: 98/100 MS: 1 ChangeBinInt- 00:08:17.154 [2024-11-30 15:45:24.884443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.154 [2024-11-30 15:45:24.884471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.154 [2024-11-30 15:45:24.884528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.154 [2024-11-30 15:45:24.884545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.154 [2024-11-30 15:45:24.884608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7595718147998050665 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.154 [2024-11-30 15:45:24.884625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.154 [2024-11-30 15:45:24.884683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.154 [2024-11-30 15:45:24.884698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.154 [2024-11-30 15:45:24.884756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.154 [2024-11-30 15:45:24.884772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:17.154 #46 NEW cov: 12591 ft: 15584 corp: 35/2835b lim: 100 exec/s: 46 rss: 74Mb L: 100/100 MS: 1 ChangeBit- 00:08:17.154 [2024-11-30 15:45:24.944341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.154 [2024-11-30 15:45:24.944370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.154 [2024-11-30 15:45:24.944413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.154 [2024-11-30 15:45:24.944430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.154 [2024-11-30 15:45:24.944504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.154 [2024-11-30 15:45:24.944524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.154 [2024-11-30 15:45:24.944585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:48 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.154 [2024-11-30 15:45:24.944606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.154 #47 NEW cov: 12591 ft: 15653 corp: 36/2921b lim: 100 exec/s: 23 rss: 74Mb L: 86/100 MS: 1 InsertByte- 00:08:17.154 #47 DONE cov: 12591 ft: 15653 corp: 36/2921b lim: 100 exec/s: 23 rss: 74Mb 00:08:17.154 Done 47 runs in 2 second(s) 00:08:17.154 15:45:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:17.154 15:45:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:17.154 15:45:25 llvm_fuzz.nvmf_llvm_fuzz -- 
../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.154 15:45:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:17.154 00:08:17.154 real 1m6.670s 00:08:17.154 user 1m39.077s 00:08:17.154 sys 0m8.783s 00:08:17.154 15:45:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:17.154 15:45:25 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:17.154 ************************************ 00:08:17.154 END TEST nvmf_llvm_fuzz 00:08:17.154 ************************************ 00:08:17.154 15:45:25 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:17.154 15:45:25 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:17.154 15:45:25 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:17.154 15:45:25 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:17.154 15:45:25 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:17.154 15:45:25 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:17.413 ************************************ 00:08:17.413 START TEST vfio_llvm_fuzz 00:08:17.413 ************************************ 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:17.413 * Looking for test storage... 00:08:17.413 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:17.413 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:17.413 --rc genhtml_branch_coverage=1 00:08:17.413 --rc genhtml_function_coverage=1 00:08:17.413 --rc genhtml_legend=1 00:08:17.413 --rc geninfo_all_blocks=1 00:08:17.413 --rc geninfo_unexecuted_blocks=1 00:08:17.413 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:17.413 ' 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:17.413 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:17.413 --rc genhtml_branch_coverage=1 00:08:17.413 --rc genhtml_function_coverage=1 00:08:17.413 --rc genhtml_legend=1 00:08:17.413 --rc geninfo_all_blocks=1 00:08:17.413 --rc geninfo_unexecuted_blocks=1 00:08:17.413 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:17.413 ' 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:17.413 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:17.413 --rc genhtml_branch_coverage=1 00:08:17.413 --rc genhtml_function_coverage=1 00:08:17.413 --rc genhtml_legend=1 00:08:17.413 --rc geninfo_all_blocks=1 00:08:17.413 --rc geninfo_unexecuted_blocks=1 00:08:17.413 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:17.413 ' 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:17.413 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:17.413 --rc genhtml_branch_coverage=1 00:08:17.413 --rc genhtml_function_coverage=1 00:08:17.413 --rc genhtml_legend=1 00:08:17.413 --rc geninfo_all_blocks=1 00:08:17.413 --rc geninfo_unexecuted_blocks=1 00:08:17.413 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:17.413 ' 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:17.413 15:45:25 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:08:17.414 15:45:25 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:17.414 15:45:25 
llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:17.414 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:17.414 #define SPDK_CONFIG_H 00:08:17.414 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:17.414 #define SPDK_CONFIG_APPS 1 00:08:17.415 #define SPDK_CONFIG_ARCH native 00:08:17.415 #undef SPDK_CONFIG_ASAN 00:08:17.415 #undef SPDK_CONFIG_AVAHI 00:08:17.415 #undef SPDK_CONFIG_CET 00:08:17.415 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:17.415 #define SPDK_CONFIG_COVERAGE 1 00:08:17.415 #define SPDK_CONFIG_CROSS_PREFIX 00:08:17.415 #undef SPDK_CONFIG_CRYPTO 00:08:17.415 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:17.415 #undef SPDK_CONFIG_CUSTOMOCF 00:08:17.415 #undef SPDK_CONFIG_DAOS 00:08:17.415 #define SPDK_CONFIG_DAOS_DIR 00:08:17.415 #define SPDK_CONFIG_DEBUG 1 00:08:17.415 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:17.415 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:17.415 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:17.415 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:17.415 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:17.415 #undef SPDK_CONFIG_DPDK_UADK 00:08:17.415 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:17.415 #define SPDK_CONFIG_EXAMPLES 1 00:08:17.415 #undef SPDK_CONFIG_FC 00:08:17.415 #define SPDK_CONFIG_FC_PATH 00:08:17.415 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:17.415 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:17.415 #define SPDK_CONFIG_FSDEV 1 00:08:17.415 #undef 
SPDK_CONFIG_FUSE 00:08:17.415 #define SPDK_CONFIG_FUZZER 1 00:08:17.415 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:17.415 #undef SPDK_CONFIG_GOLANG 00:08:17.415 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:17.415 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:17.415 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:17.415 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:17.415 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:17.415 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:17.415 #undef SPDK_CONFIG_HAVE_LZ4 00:08:17.415 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:17.415 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:17.415 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:17.415 #define SPDK_CONFIG_IDXD 1 00:08:17.415 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:17.415 #undef SPDK_CONFIG_IPSEC_MB 00:08:17.415 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:17.415 #define SPDK_CONFIG_ISAL 1 00:08:17.415 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:17.415 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:17.415 #define SPDK_CONFIG_LIBDIR 00:08:17.415 #undef SPDK_CONFIG_LTO 00:08:17.415 #define SPDK_CONFIG_MAX_LCORES 128 00:08:17.415 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:08:17.415 #define SPDK_CONFIG_NVME_CUSE 1 00:08:17.415 #undef SPDK_CONFIG_OCF 00:08:17.415 #define SPDK_CONFIG_OCF_PATH 00:08:17.415 #define SPDK_CONFIG_OPENSSL_PATH 00:08:17.415 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:17.415 #define SPDK_CONFIG_PGO_DIR 00:08:17.415 #undef SPDK_CONFIG_PGO_USE 00:08:17.415 #define SPDK_CONFIG_PREFIX /usr/local 00:08:17.415 #undef SPDK_CONFIG_RAID5F 00:08:17.415 #undef SPDK_CONFIG_RBD 00:08:17.415 #define SPDK_CONFIG_RDMA 1 00:08:17.415 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:17.415 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:17.415 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:17.415 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:17.415 #undef SPDK_CONFIG_SHARED 00:08:17.415 #undef SPDK_CONFIG_SMA 00:08:17.415 #define SPDK_CONFIG_TESTS 1 00:08:17.415 #undef SPDK_CONFIG_TSAN 00:08:17.415 #define SPDK_CONFIG_UBLK 1 00:08:17.415 #define SPDK_CONFIG_UBSAN 1 00:08:17.415 #undef SPDK_CONFIG_UNIT_TESTS 00:08:17.415 #undef SPDK_CONFIG_URING 00:08:17.415 #define SPDK_CONFIG_URING_PATH 00:08:17.415 #undef SPDK_CONFIG_URING_ZNS 00:08:17.415 #undef SPDK_CONFIG_USDT 00:08:17.415 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:17.415 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:17.415 #define SPDK_CONFIG_VFIO_USER 1 00:08:17.415 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:17.415 #define SPDK_CONFIG_VHOST 1 00:08:17.415 #define SPDK_CONFIG_VIRTIO 1 00:08:17.415 #undef SPDK_CONFIG_VTUNE 00:08:17.415 #define SPDK_CONFIG_VTUNE_DIR 00:08:17.415 #define SPDK_CONFIG_WERROR 1 00:08:17.415 #define SPDK_CONFIG_WPDK_DIR 00:08:17.415 #undef SPDK_CONFIG_XNVME 00:08:17.415 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:17.415 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:17.675 15:45:25 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:17.675 
15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:17.675 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : main 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 
00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:17.676 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 1722638 ]] 00:08:17.677 
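A note on the --gcov-tool hook assembled above: lcov shells out to a gcov-compatible tool, but this run is built with clang (it links libclang_rt.fuzzer), whose coverage profiles only llvm-cov understands, so the trace points lcov at a wrapper script instead. A minimal sketch of such a wrapper, assuming the real spdk/test/fuzz/llvm/llvm-gcov.sh does nothing more elaborate:

#!/usr/bin/env bash
# Invoked by lcov wherever it would normally call gcov; delegate to
# llvm-cov's gcov-compatible mode so clang-emitted profiles can be read.
exec llvm-cov gcov "$@"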
15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 1722638 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.MylkvY 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.MylkvY/tests/vfio /tmp/spdk.MylkvY 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=51135397888 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730607104 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=10595209216 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30860537856 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865301504 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=4763648 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340129792 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346122240 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5992448 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30863253504 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865305600 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=2052096 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173044736 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173057024 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:08:17.677 * Looking for test storage... 
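The storage probe traced through set_test_storage above, together with the candidate walk that follows, boils down to the sketch below. Variable names mirror the trace; df --block-size=1 is an assumption made to match the byte counts shown, and testdir here is a hypothetical stand-in for the caller's test directory:

declare -A mounts fss sizes avails uses
testdir=${testdir:-$PWD}                          # hypothetical stand-in
storage_fallback=$(mktemp -udt spdk.XXXXXX)       # same call as in the trace
requested_size=$((2147483648 + 64 * 1024 * 1024)) # 2G + 64M slack = 2214592512

# Read every mount into associative arrays, keyed by mount point.
while read -r source fs size use avail _ mount; do
  mounts["$mount"]=$source
  fss["$mount"]=$fs
  sizes["$mount"]=$size
  uses["$mount"]=$use
  avails["$mount"]=$avail
done < <(df -T --block-size=1 | grep -v Filesystem)

# Walk the candidates and take the first one whose filesystem has room.
for target_dir in "$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback"; do
  mkdir -p "$target_dir"
  mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
  if (( ${avails["$mount"]:-0} >= requested_size )); then
    printf '* Found test storage at %s\n' "$target_dir"
    break
  fi
done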
00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=51135397888 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=12809801728 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:17.677 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:17.678 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:17.678 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:17.678 --rc genhtml_branch_coverage=1 00:08:17.678 --rc genhtml_function_coverage=1 00:08:17.678 --rc genhtml_legend=1 00:08:17.678 --rc geninfo_all_blocks=1 00:08:17.678 --rc geninfo_unexecuted_blocks=1 00:08:17.678 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:17.678 ' 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:17.678 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:17.678 --rc genhtml_branch_coverage=1 00:08:17.678 --rc genhtml_function_coverage=1 00:08:17.678 --rc genhtml_legend=1 00:08:17.678 --rc geninfo_all_blocks=1 00:08:17.678 --rc geninfo_unexecuted_blocks=1 00:08:17.678 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:17.678 ' 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:17.678 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:17.678 --rc genhtml_branch_coverage=1 00:08:17.678 --rc genhtml_function_coverage=1 00:08:17.678 --rc genhtml_legend=1 00:08:17.678 --rc geninfo_all_blocks=1 00:08:17.678 --rc geninfo_unexecuted_blocks=1 00:08:17.678 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:17.678 ' 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:17.678 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:17.678 --rc genhtml_branch_coverage=1 00:08:17.678 --rc genhtml_function_coverage=1 00:08:17.678 --rc genhtml_legend=1 00:08:17.678 --rc geninfo_all_blocks=1 00:08:17.678 --rc geninfo_unexecuted_blocks=1 00:08:17.678 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:17.678 ' 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:17.678 15:45:25 
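The lcov version gate just traced (lt 1.15 2 via cmp_versions) is a plain component-wise compare: split both version strings on ., - and :, then compare numerically field by field. A self-contained sketch of the same idea, with the caveat that missing or non-numeric components are simply treated as 0 here, whereas the traced helper routes components through its decimal() check:

cmp_versions() {
  local -a ver1 ver2
  local op=$2 v a b
  IFS=.-: read -ra ver1 <<< "$1"
  IFS=.-: read -ra ver2 <<< "$3"
  for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
    a=${ver1[v]:-0} b=${ver2[v]:-0}
    (( a > b )) && { [[ $op == '>' ]]; return; }
    (( a < b )) && { [[ $op == '<' ]]; return; }
  done
  [[ $op == '==' ]]
}

cmp_versions 1.15 '<' 2 && echo "lcov 1.15 is older than 2"   # matches the traced result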
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:17.678 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:17.678 15:45:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:17.937 [2024-11-30 15:45:25.639159] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:17.937 [2024-11-30 15:45:25.639244] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1722797 ] 00:08:17.937 [2024-11-30 15:45:25.775268] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:17.937 [2024-11-30 15:45:25.820378] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.937 [2024-11-30 15:45:25.843083] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.195 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.195 INFO: Seed: 4097463397 00:08:18.195 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:08:18.195 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:08:18.195 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:18.195 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.195 #2 INITED exec/s: 0 rss: 66Mb 00:08:18.195 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:18.195 This may also happen if the target rejected all inputs we tried so far 00:08:18.195 [2024-11-30 15:45:26.073175] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:08:18.711 NEW_FUNC[1/674]: 0x45e728 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:08:18.711 NEW_FUNC[2/674]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:18.711 #4 NEW cov: 11140 ft: 11173 corp: 2/7b lim: 6 exec/s: 0 rss: 71Mb L: 6/6 MS: 2 InsertRepeatedBytes-InsertByte- 00:08:18.969 NEW_FUNC[1/1]: 0x482978 in bdev_malloc_writev /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/bdev/malloc/bdev_malloc.c:418 00:08:18.969 #10 NEW cov: 11250 ft: 14801 corp: 3/13b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 CMP- DE: "\377\004"- 00:08:18.969 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:18.969 #16 NEW cov: 11267 ft: 16046 corp: 4/19b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:19.228 #17 NEW cov: 11267 ft: 17373 corp: 5/25b lim: 6 exec/s: 17 rss: 74Mb L: 6/6 MS: 1 ChangeByte- 00:08:19.486 #23 NEW cov: 11274 ft: 17829 corp: 6/31b lim: 6 exec/s: 23 rss: 74Mb L: 6/6 MS: 1 PersAutoDict- DE: "\377\004"- 00:08:19.745 #25 NEW cov: 11274 ft: 18309 corp: 7/37b lim: 6 exec/s: 25 rss: 74Mb L: 6/6 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:19.745 #26 NEW cov: 11274 ft: 18408 corp: 8/43b lim: 6 exec/s: 26 rss: 74Mb L: 6/6 MS: 1 ChangeBit- 00:08:20.004 #32 NEW cov: 11281 ft: 18527 corp: 9/49b lim: 6 exec/s: 32 rss: 74Mb L: 6/6 MS: 1 ChangeBinInt- 00:08:20.263 #33 NEW cov: 11281 ft: 18682 corp: 10/55b lim: 6 exec/s: 16 rss: 74Mb L: 6/6 
MS: 1 CopyPart- 00:08:20.263 #33 DONE cov: 11281 ft: 18682 corp: 10/55b lim: 6 exec/s: 16 rss: 74Mb 00:08:20.263 ###### Recommended dictionary. ###### 00:08:20.263 "\377\004" # Uses: 4 00:08:20.263 ###### End of recommended dictionary. ###### 00:08:20.263 Done 33 runs in 2 second(s) 00:08:20.263 [2024-11-30 15:45:28.068809] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:20.522 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:20.522 15:45:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:20.522 [2024-11-30 15:45:28.316189] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:08:20.522 [2024-11-30 15:45:28.316258] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1723221 ] 00:08:20.522 [2024-11-30 15:45:28.450672] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:20.781 [2024-11-30 15:45:28.496508] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.781 [2024-11-30 15:45:28.520323] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.781 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.781 INFO: Seed: 2479494331 00:08:20.781 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:08:20.781 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:08:20.781 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:20.781 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.781 #2 INITED exec/s: 0 rss: 66Mb 00:08:20.781 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:20.781 This may also happen if the target rejected all inputs we tried so far 00:08:21.039 [2024-11-30 15:45:28.750046] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:08:21.039 [2024-11-30 15:45:28.804653] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:21.039 [2024-11-30 15:45:28.804678] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:21.039 [2024-11-30 15:45:28.804696] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:21.297 NEW_FUNC[1/678]: 0x45ecc8 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:08:21.297 NEW_FUNC[2/678]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:21.297 #21 NEW cov: 11238 ft: 11210 corp: 2/5b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 4 ChangeByte-ChangeBit-InsertByte-CopyPart- 00:08:21.556 [2024-11-30 15:45:29.288422] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:21.556 [2024-11-30 15:45:29.288455] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:21.556 [2024-11-30 15:45:29.288473] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:21.556 #47 NEW cov: 11252 ft: 14501 corp: 3/9b lim: 4 exec/s: 0 rss: 76Mb L: 4/4 MS: 1 ChangeByte- 00:08:21.556 [2024-11-30 15:45:29.485660] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:21.556 [2024-11-30 15:45:29.485683] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:21.556 [2024-11-30 15:45:29.485700] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:21.815 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:21.815 #50 NEW cov: 11269 ft: 15178 corp: 4/13b lim: 4 exec/s: 0 rss: 77Mb L: 4/4 MS: 3 CrossOver-ChangeBit-InsertByte- 00:08:21.815 [2024-11-30 15:45:29.688346] 
vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:21.815 [2024-11-30 15:45:29.688369] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:21.815 [2024-11-30 15:45:29.688388] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:22.073 #51 NEW cov: 11269 ft: 15523 corp: 5/17b lim: 4 exec/s: 51 rss: 77Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:22.073 [2024-11-30 15:45:29.878095] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:22.073 [2024-11-30 15:45:29.878119] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:22.074 [2024-11-30 15:45:29.878138] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:22.074 #52 NEW cov: 11269 ft: 16744 corp: 6/21b lim: 4 exec/s: 52 rss: 77Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:22.332 [2024-11-30 15:45:30.082840] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:22.332 [2024-11-30 15:45:30.082872] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:22.332 [2024-11-30 15:45:30.082891] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:22.332 #53 NEW cov: 11269 ft: 16939 corp: 7/25b lim: 4 exec/s: 53 rss: 77Mb L: 4/4 MS: 1 CrossOver- 00:08:22.332 [2024-11-30 15:45:30.277555] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:22.332 [2024-11-30 15:45:30.277583] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:22.332 [2024-11-30 15:45:30.277603] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:22.590 #54 NEW cov: 11269 ft: 17536 corp: 8/29b lim: 4 exec/s: 54 rss: 77Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:22.590 [2024-11-30 15:45:30.471559] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:22.590 [2024-11-30 15:45:30.471582] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:22.590 [2024-11-30 15:45:30.471604] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:22.848 #55 NEW cov: 11276 ft: 17819 corp: 9/33b lim: 4 exec/s: 55 rss: 77Mb L: 4/4 MS: 1 CrossOver- 00:08:22.848 [2024-11-30 15:45:30.659246] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:22.848 [2024-11-30 15:45:30.659274] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:22.848 [2024-11-30 15:45:30.659292] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:22.848 #56 NEW cov: 11276 ft: 18214 corp: 10/37b lim: 4 exec/s: 28 rss: 77Mb L: 4/4 MS: 1 CopyPart- 00:08:22.848 #56 DONE cov: 11276 ft: 18214 corp: 10/37b lim: 4 exec/s: 28 rss: 77Mb 00:08:22.848 Done 56 runs in 2 second(s) 00:08:22.848 [2024-11-30 15:45:30.793802] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:23.107 15:45:31 
llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:23.107 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:23.107 15:45:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:23.107 [2024-11-30 15:45:31.044832] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:23.107 [2024-11-30 15:45:31.044894] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1723624 ] 00:08:23.366 [2024-11-30 15:45:31.179102] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:23.366 [2024-11-30 15:45:31.224785] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.366 [2024-11-30 15:45:31.247675] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.624 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:23.624 INFO: Seed: 913545966 00:08:23.624 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:08:23.624 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:08:23.624 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:23.624 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.624 #2 INITED exec/s: 0 rss: 68Mb 00:08:23.624 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:23.624 This may also happen if the target rejected all inputs we tried so far 00:08:23.624 [2024-11-30 15:45:31.482997] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:08:23.624 [2024-11-30 15:45:31.535113] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:24.142 NEW_FUNC[1/677]: 0x45f6b8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:08:24.142 NEW_FUNC[2/677]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:24.142 #6 NEW cov: 11188 ft: 10958 corp: 2/9b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 4 ChangeBit-ChangeBit-InsertRepeatedBytes-CopyPart- 00:08:24.142 [2024-11-30 15:45:31.994038] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:24.142 #16 NEW cov: 11232 ft: 14239 corp: 3/17b lim: 8 exec/s: 0 rss: 76Mb L: 8/8 MS: 5 EraseBytes-ChangeBinInt-ChangeBit-ChangeByte-CrossOver- 00:08:24.401 [2024-11-30 15:45:32.167329] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:24.401 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:24.401 #21 NEW cov: 11252 ft: 15140 corp: 4/25b lim: 8 exec/s: 0 rss: 77Mb L: 8/8 MS: 5 CopyPart-CrossOver-CrossOver-InsertByte-CrossOver- 00:08:24.659 [2024-11-30 15:45:32.367539] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:24.659 #22 NEW cov: 11252 ft: 16282 corp: 5/33b lim: 8 exec/s: 22 rss: 77Mb L: 8/8 MS: 1 CopyPart- 00:08:24.659 [2024-11-30 15:45:32.546475] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:24.919 #23 NEW cov: 11252 ft: 17125 corp: 6/41b lim: 8 exec/s: 23 rss: 77Mb L: 8/8 MS: 1 CopyPart- 00:08:24.919 [2024-11-30 15:45:32.725299] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:24.919 #24 NEW cov: 11252 ft: 17550 corp: 7/49b lim: 8 exec/s: 24 rss: 77Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:25.177 [2024-11-30 15:45:32.905285] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:25.177 #25 NEW cov: 11252 ft: 17698 corp: 8/57b lim: 8 exec/s: 25 rss: 77Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:25.177 [2024-11-30 15:45:33.078447] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:25.436 #31 NEW cov: 11252 ft: 17822 corp: 9/65b lim: 8 exec/s: 31 rss: 77Mb L: 8/8 MS: 1 CrossOver- 00:08:25.436 [2024-11-30 15:45:33.250288] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:25.436 #32 NEW cov: 11259 ft: 17849 corp: 10/73b lim: 8 exec/s: 32 rss: 77Mb L: 8/8 MS: 1 ChangeBit- 00:08:25.695 [2024-11-30 15:45:33.422624] vfio_user.c: 
170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:25.695 #38 NEW cov: 11259 ft: 17863 corp: 11/81b lim: 8 exec/s: 19 rss: 77Mb L: 8/8 MS: 1 ChangeByte- 00:08:25.695 #38 DONE cov: 11259 ft: 17863 corp: 11/81b lim: 8 exec/s: 19 rss: 77Mb 00:08:25.695 Done 38 runs in 2 second(s) 00:08:25.695 [2024-11-30 15:45:33.549797] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:25.956 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:25.956 15:45:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:25.956 [2024-11-30 15:45:33.792178] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:08:25.956 [2024-11-30 15:45:33.792232] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1724162 ] 00:08:26.215 [2024-11-30 15:45:33.925739] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:26.215 [2024-11-30 15:45:33.969436] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.215 [2024-11-30 15:45:33.991458] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.215 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.215 INFO: Seed: 3653534272 00:08:26.473 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:08:26.473 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:08:26.473 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:26.473 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.473 #2 INITED exec/s: 0 rss: 66Mb 00:08:26.473 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:26.473 This may also happen if the target rejected all inputs we tried so far 00:08:26.473 [2024-11-30 15:45:34.222341] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:08:26.732 NEW_FUNC[1/677]: 0x45fda8 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:08:26.732 NEW_FUNC[2/677]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:26.732 #257 NEW cov: 11227 ft: 11194 corp: 2/33b lim: 32 exec/s: 0 rss: 71Mb L: 32/32 MS: 5 CopyPart-ChangeBinInt-InsertRepeatedBytes-ChangeBinInt-CopyPart- 00:08:26.990 #288 NEW cov: 11243 ft: 14492 corp: 3/65b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 CrossOver- 00:08:27.249 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:27.249 #289 NEW cov: 11260 ft: 14787 corp: 4/97b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ChangeBit- 00:08:27.249 #290 NEW cov: 11260 ft: 15607 corp: 5/129b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ChangeByte- 00:08:27.507 #291 NEW cov: 11260 ft: 16264 corp: 6/161b lim: 32 exec/s: 291 rss: 73Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:27.766 #292 NEW cov: 11260 ft: 16716 corp: 7/193b lim: 32 exec/s: 292 rss: 73Mb L: 32/32 MS: 1 CrossOver- 00:08:27.766 #322 NEW cov: 11260 ft: 16887 corp: 8/225b lim: 32 exec/s: 322 rss: 73Mb L: 32/32 MS: 5 ShuffleBytes-InsertRepeatedBytes-CopyPart-InsertByte-InsertRepeatedBytes- 00:08:28.025 #323 NEW cov: 11260 ft: 17213 corp: 9/257b lim: 32 exec/s: 323 rss: 73Mb L: 32/32 MS: 1 CopyPart- 00:08:28.283 #324 NEW cov: 11267 ft: 17725 corp: 10/289b lim: 32 exec/s: 324 rss: 73Mb L: 32/32 MS: 1 CrossOver- 00:08:28.283 #325 NEW cov: 11267 ft: 18137 corp: 11/321b lim: 32 exec/s: 162 rss: 73Mb L: 32/32 MS: 1 ChangeBit- 00:08:28.283 #325 DONE cov: 11267 ft: 18137 corp: 11/321b lim: 32 exec/s: 162 rss: 73Mb 00:08:28.283 Done 325 runs in 2 second(s) 00:08:28.543 [2024-11-30 15:45:36.249802] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf 
/tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:28.543 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:28.543 15:45:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:28.543 [2024-11-30 15:45:36.493231] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:28.543 [2024-11-30 15:45:36.493287] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1724696 ] 00:08:28.802 [2024-11-30 15:45:36.626420] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:08:28.802 [2024-11-30 15:45:36.669842] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.802 [2024-11-30 15:45:36.692007] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.061 INFO: Running with entropic power schedule (0xFF, 100). 00:08:29.061 INFO: Seed: 2061562095 00:08:29.061 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:08:29.061 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:08:29.061 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:29.061 INFO: A corpus is not provided, starting from an empty corpus 00:08:29.061 #2 INITED exec/s: 0 rss: 67Mb 00:08:29.061 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:29.061 This may also happen if the target rejected all inputs we tried so far 00:08:29.061 [2024-11-30 15:45:36.925326] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:08:29.578 NEW_FUNC[1/677]: 0x460628 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:08:29.578 NEW_FUNC[2/677]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:29.578 #52 NEW cov: 11224 ft: 10991 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 5 InsertRepeatedBytes-ChangeBinInt-CopyPart-ChangeBit-CopyPart- 00:08:29.578 #58 NEW cov: 11241 ft: 14668 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:29.837 #59 NEW cov: 11241 ft: 15674 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:08:29.837 #60 NEW cov: 11241 ft: 15902 corp: 5/129b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:29.837 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:29.837 #61 NEW cov: 11258 ft: 16117 corp: 6/161b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:30.096 #66 NEW cov: 11258 ft: 16835 corp: 7/193b lim: 32 exec/s: 66 rss: 75Mb L: 32/32 MS: 5 CopyPart-ShuffleBytes-CMP-InsertRepeatedBytes-InsertByte- DE: "\200\000\000\000"- 00:08:30.096 #67 NEW cov: 11258 ft: 16927 corp: 8/225b lim: 32 exec/s: 67 rss: 75Mb L: 32/32 MS: 1 ChangeBit- 00:08:30.355 #73 NEW cov: 11258 ft: 16987 corp: 9/257b lim: 32 exec/s: 73 rss: 75Mb L: 32/32 MS: 1 ChangeBit- 00:08:30.355 #74 NEW cov: 11258 ft: 17364 corp: 10/289b lim: 32 exec/s: 74 rss: 75Mb L: 32/32 MS: 1 CrossOver- 00:08:30.614 #75 NEW cov: 11258 ft: 17644 corp: 11/321b lim: 32 exec/s: 75 rss: 75Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:30.614 #76 NEW cov: 11258 ft: 17727 corp: 12/353b lim: 32 exec/s: 76 rss: 75Mb L: 32/32 MS: 1 PersAutoDict- DE: "\200\000\000\000"- 00:08:30.873 #77 NEW cov: 11258 ft: 18233 corp: 13/385b lim: 32 exec/s: 77 rss: 75Mb L: 32/32 MS: 1 PersAutoDict- DE: "\200\000\000\000"- 00:08:30.873 #78 NEW cov: 11265 ft: 18410 corp: 14/417b lim: 32 exec/s: 78 rss: 75Mb L: 32/32 MS: 1 ChangeByte- 00:08:30.873 #79 NEW cov: 11265 ft: 18521 corp: 15/449b lim: 32 exec/s: 79 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:31.132 #80 NEW cov: 11265 ft: 18540 corp: 16/481b lim: 32 exec/s: 40 rss: 75Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:31.132 #80 DONE cov: 11265 ft: 18540 corp: 16/481b lim: 32 exec/s: 40 rss: 75Mb 00:08:31.132 ###### Recommended dictionary. 
###### 00:08:31.132 "\200\000\000\000" # Uses: 2 00:08:31.132 ###### End of recommended dictionary. ###### 00:08:31.132 Done 80 runs in 2 second(s) 00:08:31.132 [2024-11-30 15:45:38.972821] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:31.391 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:31.391 15:45:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:31.391 [2024-11-30 15:45:39.231275] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:08:31.391 [2024-11-30 15:45:39.231367] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1725155 ] 00:08:31.650 [2024-11-30 15:45:39.367905] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:31.650 [2024-11-30 15:45:39.412970] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.650 [2024-11-30 15:45:39.435365] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.650 INFO: Running with entropic power schedule (0xFF, 100). 00:08:31.650 INFO: Seed: 507608410 00:08:31.909 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:08:31.909 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:08:31.909 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:31.909 INFO: A corpus is not provided, starting from an empty corpus 00:08:31.909 #2 INITED exec/s: 0 rss: 66Mb 00:08:31.909 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:31.909 This may also happen if the target rejected all inputs we tried so far 00:08:31.909 [2024-11-30 15:45:39.666499] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:08:31.909 [2024-11-30 15:45:39.703636] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:31.909 [2024-11-30 15:45:39.703706] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:32.167 NEW_FUNC[1/678]: 0x461028 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:08:32.167 NEW_FUNC[2/678]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:32.167 #11 NEW cov: 11226 ft: 11205 corp: 2/14b lim: 13 exec/s: 0 rss: 72Mb L: 13/13 MS: 4 InsertRepeatedBytes-ShuffleBytes-CMP-CopyPart- DE: "\027\000\000\000"- 00:08:32.425 [2024-11-30 15:45:40.184532] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:32.425 [2024-11-30 15:45:40.184581] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:32.425 #12 NEW cov: 11254 ft: 14364 corp: 3/27b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 1 PersAutoDict- DE: "\027\000\000\000"- 00:08:32.425 [2024-11-30 15:45:40.363439] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:32.425 [2024-11-30 15:45:40.363472] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:32.684 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:32.684 #15 NEW cov: 11271 ft: 15114 corp: 4/40b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 3 ChangeBinInt-ShuffleBytes-InsertRepeatedBytes- 00:08:32.684 [2024-11-30 15:45:40.550678] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:32.684 [2024-11-30 15:45:40.550710] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:32.942 #16 NEW cov: 11271 ft: 15727 corp: 5/53b lim: 13 exec/s: 16 rss: 74Mb L: 13/13 MS: 1 
PersAutoDict- DE: "\027\000\000\000"- 00:08:32.942 [2024-11-30 15:45:40.728804] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:32.942 [2024-11-30 15:45:40.728836] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:32.942 #17 NEW cov: 11271 ft: 16040 corp: 6/66b lim: 13 exec/s: 17 rss: 74Mb L: 13/13 MS: 1 ChangeByte- 00:08:32.942 [2024-11-30 15:45:40.907027] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:32.942 [2024-11-30 15:45:40.907058] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:33.201 #23 NEW cov: 11271 ft: 16360 corp: 7/79b lim: 13 exec/s: 23 rss: 74Mb L: 13/13 MS: 1 CMP- DE: "\335\374\034\320\334\306\224\000"- 00:08:33.201 [2024-11-30 15:45:41.083450] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:33.201 [2024-11-30 15:45:41.083480] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:33.460 #29 NEW cov: 11271 ft: 16402 corp: 8/92b lim: 13 exec/s: 29 rss: 74Mb L: 13/13 MS: 1 ShuffleBytes- 00:08:33.460 [2024-11-30 15:45:41.259318] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:33.460 [2024-11-30 15:45:41.259348] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:33.460 #40 NEW cov: 11271 ft: 16424 corp: 9/105b lim: 13 exec/s: 40 rss: 75Mb L: 13/13 MS: 1 CopyPart- 00:08:33.718 [2024-11-30 15:45:41.434068] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:33.718 [2024-11-30 15:45:41.434099] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:33.718 #46 NEW cov: 11278 ft: 16818 corp: 10/118b lim: 13 exec/s: 46 rss: 75Mb L: 13/13 MS: 1 ChangeBit- 00:08:33.718 [2024-11-30 15:45:41.596901] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:33.718 [2024-11-30 15:45:41.596932] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:33.977 #52 NEW cov: 11278 ft: 17707 corp: 11/131b lim: 13 exec/s: 26 rss: 75Mb L: 13/13 MS: 1 ChangeBit- 00:08:33.977 #52 DONE cov: 11278 ft: 17707 corp: 11/131b lim: 13 exec/s: 26 rss: 75Mb 00:08:33.977 ###### Recommended dictionary. ###### 00:08:33.977 "\027\000\000\000" # Uses: 3 00:08:33.977 "\335\374\034\320\334\306\224\000" # Uses: 0 00:08:33.977 ###### End of recommended dictionary. 
###### 00:08:33.977 Done 52 runs in 2 second(s) 00:08:33.977 [2024-11-30 15:45:41.718788] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:33.977 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:33.977 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:34.236 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:34.236 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:34.236 15:45:41 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:34.236 [2024-11-30 15:45:41.961386] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:34.236 [2024-11-30 15:45:41.961440] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1725543 ] 00:08:34.236 [2024-11-30 15:45:42.094261] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:08:34.236 [2024-11-30 15:45:42.139303] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.236 [2024-11-30 15:45:42.161643] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.496 INFO: Running with entropic power schedule (0xFF, 100). 00:08:34.496 INFO: Seed: 3236597973 00:08:34.496 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:08:34.496 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:08:34.496 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:34.496 INFO: A corpus is not provided, starting from an empty corpus 00:08:34.496 #2 INITED exec/s: 0 rss: 66Mb 00:08:34.496 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:34.496 This may also happen if the target rejected all inputs we tried so far 00:08:34.496 [2024-11-30 15:45:42.403282] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:08:34.496 [2024-11-30 15:45:42.446646] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:34.496 [2024-11-30 15:45:42.446678] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:35.014 NEW_FUNC[1/678]: 0x461d18 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:35.014 NEW_FUNC[2/678]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:35.014 #22 NEW cov: 11228 ft: 11183 corp: 2/10b lim: 9 exec/s: 0 rss: 72Mb L: 9/9 MS: 5 ShuffleBytes-InsertRepeatedBytes-InsertByte-ChangeBit-InsertRepeatedBytes- 00:08:35.014 [2024-11-30 15:45:42.903592] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:35.014 [2024-11-30 15:45:42.903638] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:35.273 #32 NEW cov: 11242 ft: 14485 corp: 3/19b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 5 InsertRepeatedBytes-CrossOver-ChangeBinInt-CopyPart-InsertByte- 00:08:35.273 [2024-11-30 15:45:43.084038] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:35.273 [2024-11-30 15:45:43.084069] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:35.273 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:35.273 #38 NEW cov: 11259 ft: 15404 corp: 4/28b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:35.531 [2024-11-30 15:45:43.254777] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:35.531 [2024-11-30 15:45:43.254808] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:35.531 #39 NEW cov: 11259 ft: 15742 corp: 5/37b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:35.531 [2024-11-30 15:45:43.429633] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:35.532 [2024-11-30 15:45:43.429663] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:35.790 #40 NEW cov: 11259 ft: 15919 corp: 6/46b lim: 9 exec/s: 40 rss: 76Mb L: 9/9 MS: 1 ChangeBit- 00:08:35.790 [2024-11-30 15:45:43.603499] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: 
Invalid argument 00:08:35.790 [2024-11-30 15:45:43.603530] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:35.790 #44 NEW cov: 11259 ft: 16083 corp: 7/55b lim: 9 exec/s: 44 rss: 76Mb L: 9/9 MS: 4 EraseBytes-EraseBytes-InsertRepeatedBytes-InsertByte- 00:08:36.049 [2024-11-30 15:45:43.782792] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:36.049 [2024-11-30 15:45:43.782822] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:36.049 #45 NEW cov: 11259 ft: 17168 corp: 8/64b lim: 9 exec/s: 45 rss: 76Mb L: 9/9 MS: 1 ChangeByte- 00:08:36.049 [2024-11-30 15:45:43.968861] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:36.049 [2024-11-30 15:45:43.968892] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:36.308 #46 NEW cov: 11259 ft: 17712 corp: 9/73b lim: 9 exec/s: 46 rss: 76Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:36.308 [2024-11-30 15:45:44.138940] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:36.308 [2024-11-30 15:45:44.138971] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:36.308 #47 NEW cov: 11266 ft: 17785 corp: 10/82b lim: 9 exec/s: 47 rss: 76Mb L: 9/9 MS: 1 CopyPart- 00:08:36.567 [2024-11-30 15:45:44.309548] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:36.567 [2024-11-30 15:45:44.309578] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:36.567 #48 NEW cov: 11266 ft: 17830 corp: 11/91b lim: 9 exec/s: 24 rss: 76Mb L: 9/9 MS: 1 CopyPart- 00:08:36.567 #48 DONE cov: 11266 ft: 17830 corp: 11/91b lim: 9 exec/s: 24 rss: 76Mb 00:08:36.567 Done 48 runs in 2 second(s) 00:08:36.567 [2024-11-30 15:45:44.432799] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:08:36.826 15:45:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:08:36.826 15:45:44 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:36.826 15:45:44 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:36.826 15:45:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:08:36.826 00:08:36.826 real 0m19.499s 00:08:36.826 user 0m26.391s 00:08:36.826 sys 0m1.814s 00:08:36.826 15:45:44 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:36.826 15:45:44 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:36.826 ************************************ 00:08:36.826 END TEST vfio_llvm_fuzz 00:08:36.826 ************************************ 00:08:36.826 00:08:36.826 real 1m26.526s 00:08:36.826 user 2m5.653s 00:08:36.826 sys 0m10.800s 00:08:36.826 15:45:44 llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:36.826 15:45:44 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:36.826 ************************************ 00:08:36.826 END TEST llvm_fuzz 00:08:36.826 ************************************ 00:08:36.826 15:45:44 -- spdk/autotest.sh@378 -- # [[ '' -eq 1 ]] 00:08:36.826 15:45:44 -- spdk/autotest.sh@385 -- # trap - SIGINT SIGTERM EXIT 00:08:36.826 15:45:44 -- spdk/autotest.sh@387 -- # timing_enter post_cleanup 00:08:36.826 15:45:44 -- common/autotest_common.sh@726 -- # xtrace_disable 00:08:36.826 15:45:44 -- common/autotest_common.sh@10 -- # set +x 
00:08:36.826 15:45:44 -- spdk/autotest.sh@388 -- # autotest_cleanup 00:08:36.826 15:45:44 -- common/autotest_common.sh@1396 -- # local autotest_es=0 00:08:36.826 15:45:44 -- common/autotest_common.sh@1397 -- # xtrace_disable 00:08:36.826 15:45:44 -- common/autotest_common.sh@10 -- # set +x 00:08:43.391 INFO: APP EXITING 00:08:43.391 INFO: killing all VMs 00:08:43.391 INFO: killing vhost app 00:08:43.391 INFO: EXIT DONE 00:08:45.928 Waiting for block devices as requested 00:08:45.928 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:46.186 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:46.186 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:46.186 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:46.445 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:46.445 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:46.445 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:46.704 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:46.704 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:46.704 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:46.704 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:46.964 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:46.964 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:46.964 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:47.223 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:47.223 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:47.481 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:08:50.943 Cleaning 00:08:50.943 Removing: /dev/shm/spdk_tgt_trace.pid1695383 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1692901 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1694165 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1695383 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1696089 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1697038 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1697234 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1698310 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1698486 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1698905 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1699290 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1699677 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1700024 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1700359 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1700650 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1700845 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1701175 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1702108 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1705280 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1705590 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1705895 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1706146 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1706718 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1706892 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1707303 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1707567 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1707859 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1707881 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1708171 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1708358 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1708816 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1709099 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1709385 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1709630 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1710239 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1710762 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1711296 00:08:50.943 Removing: /var/run/dpdk/spdk_pid1711805 00:08:50.943 Removing: 
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1712655
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1713191
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1713566
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1714021
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1714559
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1715054
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1715387
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1715912
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1716451
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1716913
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1717273
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1717913
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1718583
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1719267
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1719733
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1720273
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1720704
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1721103
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1721637
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1722172
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1722797
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1723221
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1723624
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1724162
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1724696
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1725155
00:08:50.943 Removing: /var/run/dpdk/spdk_pid1725543
00:08:50.943 Clean
00:08:51.202 15:45:58 -- common/autotest_common.sh@1453 -- # return 0
00:08:51.202 15:45:58 -- spdk/autotest.sh@389 -- # timing_exit post_cleanup
00:08:51.202 15:45:58 -- common/autotest_common.sh@732 -- # xtrace_disable
00:08:51.202 15:45:58 -- common/autotest_common.sh@10 -- # set +x
00:08:51.202 15:45:59 -- spdk/autotest.sh@391 -- # timing_exit autotest
00:08:51.202 15:45:59 -- common/autotest_common.sh@732 -- # xtrace_disable
00:08:51.202 15:45:59 -- common/autotest_common.sh@10 -- # set +x
00:08:51.202 15:45:59 -- spdk/autotest.sh@392 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:08:51.202 15:45:59 -- spdk/autotest.sh@394 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:08:51.202 15:45:59 -- spdk/autotest.sh@394 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:08:51.202 15:45:59 -- spdk/autotest.sh@396 -- # [[ y == y ]]
00:08:51.202 15:45:59 -- spdk/autotest.sh@398 -- # hostname
00:08:51.202 15:45:59 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info
00:08:51.461 geninfo: WARNING: invalid characters removed from testname!
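The lcov capture above hands geninfo a --gcov-tool wrapper, test/fuzz/llvm/llvm-gcov.sh, so it can read the .gcda files written by clang-instrumented binaries. The wrapper's contents are not shown in this log; a minimal sketch of such a wrapper, assuming llvm-cov is on PATH, would be:

#!/usr/bin/env bash
# Expose llvm-cov's gcov emulation through the plain-gcov command-line
# interface that lcov/geninfo expect from a --gcov-tool.
exec llvm-cov gcov "$@"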
00:08:58.027 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda
00:08:58.027 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda
00:09:04.607 15:46:11 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:11.169 15:46:18 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:16.436 15:46:24 -- spdk/autotest.sh@404 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:21.720 15:46:29 -- spdk/autotest.sh@405 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:28.286 15:46:34 -- spdk/autotest.sh@406 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:32.481 15:46:40 -- spdk/autotest.sh@407 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
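Taken together, the autotest.sh@399 through @407 steps above implement a capture-combine-filter flow: the baseline and post-test captures are merged with -a, then records for out-of-tree or uninteresting sources (dpdk, /usr, example and tool code) are stripped with -r. A condensed sketch of the same flow, where $OUT stands in for the output directory used in this log and the variable name itself is illustrative:

# Merge the baseline and post-test tracefiles, then prune coverage
# records that do not belong to the SPDK tree proper.
OUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
lcov -q -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" -o "$OUT/cov_total.info"
lcov -q -r "$OUT/cov_total.info" '*/dpdk/*' -o "$OUT/cov_total.info"
lcov -q -r "$OUT/cov_total.info" '/usr/*' -o "$OUT/cov_total.info"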
00:09:37.749 15:46:45 -- spdk/autotest.sh@408 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:09:37.749 15:46:45 -- spdk/autorun.sh@1 -- $ timing_finish
00:09:37.749 15:46:45 -- common/autotest_common.sh@738 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt ]]
00:09:37.749 15:46:45 -- common/autotest_common.sh@740 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:09:37.749 15:46:45 -- common/autotest_common.sh@741 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]]
00:09:37.749 15:46:45 -- common/autotest_common.sh@744 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:37.749 + [[ -n 1565532 ]]
00:09:37.749 + sudo kill 1565532
00:09:37.758 [Pipeline] }
00:09:37.773 [Pipeline] // stage
00:09:37.779 [Pipeline] }
00:09:37.794 [Pipeline] // timeout
00:09:37.799 [Pipeline] }
00:09:37.813 [Pipeline] // catchError
00:09:37.818 [Pipeline] }
00:09:37.834 [Pipeline] // wrap
00:09:37.841 [Pipeline] }
00:09:37.855 [Pipeline] // catchError
00:09:37.867 [Pipeline] stage
00:09:37.869 [Pipeline] { (Epilogue)
00:09:37.883 [Pipeline] catchError
00:09:37.885 [Pipeline] {
00:09:37.898 [Pipeline] echo
00:09:37.900 Cleanup processes
00:09:37.906 [Pipeline] sh
00:09:38.192 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:38.192 1733976 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:38.206 [Pipeline] sh
00:09:38.491 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:38.491 ++ grep -v 'sudo pgrep'
00:09:38.491 ++ awk '{print $1}'
00:09:38.491 + sudo kill -9
00:09:38.505 + true
00:09:38.789 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:09:38.789 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:38.789 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:40.164 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:52.444 [Pipeline] sh
00:09:52.726 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:09:52.726 Artifacts sizes are good
00:09:52.740 [Pipeline] archiveArtifacts
00:09:52.748 Archiving artifacts
00:09:52.925 [Pipeline] sh
00:09:53.258 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:09:53.273 [Pipeline] cleanWs
00:09:53.283 [WS-CLEANUP] Deleting project workspace...
00:09:53.283 [WS-CLEANUP] Deferred wipeout is used...
00:09:53.290 [WS-CLEANUP] done
00:09:53.292 [Pipeline] }
00:09:53.308 [Pipeline] // catchError
00:09:53.320 [Pipeline] sh
00:09:53.609 + logger -p user.info -t JENKINS-CI
00:09:53.615 [Pipeline] }
00:09:53.623 [Pipeline] // stage
00:09:53.628 [Pipeline] }
00:09:53.637 [Pipeline] // node
00:09:53.640 [Pipeline] End of Pipeline
00:09:53.678 Finished: SUCCESS
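Both the prologue and the epilogue of this pipeline clean up stray processes with the same pgrep idiom seen above: list processes matching the workspace path, drop the pgrep invocation itself, extract the PIDs, and kill them, tolerating an empty match (hence the trailing + true). A standalone sketch of that idiom:

# Kill any processes still running out of the workspace tree; the
# '|| true' keeps the step green when nothing matched.
pids=$(sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk \
      | grep -v 'sudo pgrep' | awk '{print $1}')
sudo kill -9 $pids || true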