00:00:00.000 Started by upstream project "autotest-spdk-master-vs-dpdk-v22.11" build number 2434
00:00:00.000 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3699
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.080 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.081 The recommended git tool is: git
00:00:00.081 using credential 00000000-0000-0000-0000-000000000002
00:00:00.083 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.124 Fetching changes from the remote Git repository
00:00:00.128 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.172 Using shallow fetch with depth 1
00:00:00.172 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.172 > git --version # timeout=10
00:00:00.212 > git --version # 'git version 2.39.2'
00:00:00.212 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.242 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.242 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:11.592 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:11.603 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:11.614 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD)
00:00:11.615 > git config core.sparsecheckout # timeout=10
00:00:11.625 > git read-tree -mu HEAD # timeout=10
00:00:11.640 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5
00:00:11.662 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag"
00:00:11.662 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10
00:00:11.791 [Pipeline] Start of Pipeline
00:00:11.805 [Pipeline] library
00:00:11.807 Loading library shm_lib@master
00:00:11.807 Library shm_lib@master is cached. Copying from home.
00:00:11.822 [Pipeline] node
00:00:11.847 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:11.848 [Pipeline] {
00:00:11.859 [Pipeline] catchError
00:00:11.860 [Pipeline] {
00:00:11.870 [Pipeline] wrap
00:00:11.876 [Pipeline] {
00:00:11.892 [Pipeline] stage
00:00:11.895 [Pipeline] { (Prologue)
00:00:12.091 [Pipeline] sh
00:00:12.870 + logger -p user.info -t JENKINS-CI
00:00:12.897 [Pipeline] echo
00:00:12.898 Node: WFP20
00:00:12.905 [Pipeline] sh
00:00:13.253 [Pipeline] setCustomBuildProperty
00:00:13.264 [Pipeline] echo
00:00:13.266 Cleanup processes
00:00:13.272 [Pipeline] sh
00:00:13.570 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:13.570 5272 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:13.584 [Pipeline] sh
00:00:13.877 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:13.877 ++ grep -v 'sudo pgrep'
00:00:13.877 ++ awk '{print $1}'
00:00:13.877 + sudo kill -9
00:00:13.877 + true
00:00:13.894 [Pipeline] cleanWs
00:00:13.904 [WS-CLEANUP] Deleting project workspace...
00:00:13.904 [WS-CLEANUP] Deferred wipeout is used...
00:00:13.916 [WS-CLEANUP] done
00:00:13.921 [Pipeline] setCustomBuildProperty
00:00:13.935 [Pipeline] sh
00:00:14.220 + sudo git config --global --replace-all safe.directory '*'
00:00:14.288 [Pipeline] httpRequest
00:00:16.295 [Pipeline] echo
00:00:16.297 Sorcerer 10.211.164.20 is alive
00:00:16.307 [Pipeline] retry
00:00:16.309 [Pipeline] {
00:00:16.323 [Pipeline] httpRequest
00:00:16.329 HttpMethod: GET
00:00:16.329 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:16.330 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:16.354 Response Code: HTTP/1.1 200 OK
00:00:16.355 Success: Status code 200 is in the accepted range: 200,404
00:00:16.355 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:47.748 [Pipeline] }
00:00:47.766 [Pipeline] // retry
00:00:47.774 [Pipeline] sh
00:00:48.072 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:48.093 [Pipeline] httpRequest
00:00:48.474 [Pipeline] echo
00:00:48.476 Sorcerer 10.211.164.20 is alive
00:00:48.494 [Pipeline] retry
00:00:48.496 [Pipeline] {
00:00:48.509 [Pipeline] httpRequest
00:00:48.513 HttpMethod: GET
00:00:48.514 URL: http://10.211.164.20/packages/spdk_8d3947977640da882a3cdcc21a7575115b7e7787.tar.gz
00:00:48.515 Sending request to url: http://10.211.164.20/packages/spdk_8d3947977640da882a3cdcc21a7575115b7e7787.tar.gz
00:00:48.525 Response Code: HTTP/1.1 200 OK
00:00:48.526 Success: Status code 200 is in the accepted range: 200,404
00:00:48.526 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_8d3947977640da882a3cdcc21a7575115b7e7787.tar.gz
00:01:52.873 [Pipeline] }
00:01:52.891 [Pipeline] // retry
00:01:52.898 [Pipeline] sh
00:01:53.194 + tar --no-same-owner -xf spdk_8d3947977640da882a3cdcc21a7575115b7e7787.tar.gz
00:01:55.763 [Pipeline] sh
00:01:56.055 + git -C spdk log --oneline -n5
00:01:56.055 8d3947977 spdk_dd: simplify `io_uring_peek_cqe` return code processing
00:01:56.055 77ee034c7 bdev/nvme: Add lock to unprotected operations around attach controller
00:01:56.055 48454bb28 bdev/nvme: Add lock to unprotected operations around detach controller
00:01:56.055 4b59d7893 bdev/nvme: Use nbdev always for local nvme_bdev pointer variables
00:01:56.055 e56f1618f lib/ftl: Add explicit support for write unit sizes of base device
00:01:56.075 [Pipeline] withCredentials
00:01:56.087 > git --version # timeout=10
00:01:56.097 > git --version # 'git version 2.39.2'
00:01:56.121 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:01:56.123 [Pipeline] {
00:01:56.132 [Pipeline] retry
00:01:56.134 [Pipeline] {
00:01:56.148 [Pipeline] sh
00:01:56.679 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4
00:01:56.955 [Pipeline] }
00:01:56.972 [Pipeline] // retry
00:01:56.977 [Pipeline] }
00:01:56.994 [Pipeline] // withCredentials
00:01:57.004 [Pipeline] httpRequest
00:01:57.807 [Pipeline] echo
00:01:57.809 Sorcerer 10.211.164.20 is alive
00:01:57.818 [Pipeline] retry
00:01:57.820 [Pipeline] {
00:01:57.833 [Pipeline] httpRequest
00:01:57.838 HttpMethod: GET
00:01:57.838 URL: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:01:57.839 Sending request to url: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:01:57.843 Response Code: HTTP/1.1 200 OK
00:01:57.843 Success: Status code 200 is in the accepted range: 200,404
00:01:57.844 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:02:06.122 [Pipeline] }
00:02:06.140 [Pipeline] // retry
00:02:06.148 [Pipeline] sh
00:02:06.440 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:02:07.842 [Pipeline] sh
00:02:08.134 + git -C dpdk log --oneline -n5
00:02:08.134 caf0f5d395 version: 22.11.4
00:02:08.134 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt"
00:02:08.134 dc9c799c7d vhost: fix missing spinlock unlock
00:02:08.134 4307659a90 net/mlx5: fix LACP redirection in Rx domain
00:02:08.134 6ef77f2a5e net/gve: fix RX buffer size alignment
00:02:08.146 [Pipeline] }
00:02:08.160 [Pipeline] // stage
00:02:08.169 [Pipeline] stage
00:02:08.171 [Pipeline] { (Prepare)
00:02:08.190 [Pipeline] writeFile
00:02:08.205 [Pipeline] sh
00:02:08.495 + logger -p user.info -t JENKINS-CI
00:02:08.509 [Pipeline] sh
00:02:08.800 + logger -p user.info -t JENKINS-CI
00:02:08.813 [Pipeline] sh
00:02:09.104 + cat autorun-spdk.conf
00:02:09.104 SPDK_RUN_FUNCTIONAL_TEST=1
00:02:09.104 SPDK_TEST_FUZZER_SHORT=1
00:02:09.104 SPDK_TEST_FUZZER=1
00:02:09.104 SPDK_TEST_SETUP=1
00:02:09.104 SPDK_RUN_UBSAN=1
00:02:09.104 SPDK_TEST_NATIVE_DPDK=v22.11.4
00:02:09.104 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:09.112 RUN_NIGHTLY=1
00:02:09.116 [Pipeline] readFile
00:02:09.149 [Pipeline] withEnv
00:02:09.151 [Pipeline] {
00:02:09.162 [Pipeline] sh
00:02:09.452 + set -ex
00:02:09.453 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:02:09.453 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:02:09.453 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:02:09.453 ++ SPDK_TEST_FUZZER_SHORT=1
00:02:09.453 ++ SPDK_TEST_FUZZER=1
00:02:09.453 ++ SPDK_TEST_SETUP=1
00:02:09.453 ++ SPDK_RUN_UBSAN=1
00:02:09.453 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:02:09.453 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:09.453 ++ RUN_NIGHTLY=1
00:02:09.453 + case $SPDK_TEST_NVMF_NICS in
00:02:09.453 + DRIVERS=
00:02:09.453 + [[ -n '' ]]
00:02:09.453 + exit 0
00:02:09.464 [Pipeline] }
00:02:09.476 [Pipeline] // withEnv
00:02:09.481 [Pipeline] }
00:02:09.491 [Pipeline] // stage
00:02:09.500 [Pipeline] catchError
00:02:09.502 [Pipeline] {
00:02:09.517 [Pipeline] timeout
00:02:09.517 Timeout set to expire in 30 min
00:02:09.519 [Pipeline] {
00:02:09.533 [Pipeline] stage
00:02:09.535 [Pipeline] { (Tests)
00:02:09.548 [Pipeline] sh
00:02:09.840 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:02:09.840 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:02:09.840 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:02:09.840 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:02:09.840 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:09.840 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:02:09.840 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:02:09.840 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:02:09.840 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:02:09.840 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:02:09.840 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:02:09.840 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:02:09.840 + source /etc/os-release
00:02:09.840 ++ NAME='Fedora Linux'
00:02:09.840 ++ VERSION='39 (Cloud Edition)'
00:02:09.840 ++ ID=fedora
00:02:09.840 ++ VERSION_ID=39
00:02:09.840 ++ VERSION_CODENAME=
00:02:09.840 ++ PLATFORM_ID=platform:f39
00:02:09.840 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:02:09.840 ++ ANSI_COLOR='0;38;2;60;110;180'
00:02:09.840 ++ LOGO=fedora-logo-icon
00:02:09.840 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:02:09.840 ++ HOME_URL=https://fedoraproject.org/
00:02:09.840 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:02:09.840 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:02:09.840 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:02:09.840 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:02:09.840 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:02:09.840 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:02:09.840 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:02:09.840 ++ SUPPORT_END=2024-11-12
00:02:09.840 ++ VARIANT='Cloud Edition'
00:02:09.840 ++ VARIANT_ID=cloud
00:02:09.840 + uname -a
00:02:09.840 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:02:09.840 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:02:13.144 Hugepages
00:02:13.144 node hugesize free / total
00:02:13.144 node0 1048576kB 0 / 0
00:02:13.144 node0 2048kB 0 / 0
00:02:13.144 node1 1048576kB 0 / 0
00:02:13.144 node1 2048kB 0 / 0
00:02:13.145
00:02:13.145 Type BDF Vendor Device NUMA Driver Device Block devices
00:02:13.145 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:02:13.145 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:02:13.145 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:02:13.145 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:02:13.145 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:02:13.145 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:02:13.145 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:02:13.145 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:02:13.145 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:02:13.145 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:02:13.145 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:02:13.145 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:02:13.145 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:02:13.145 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:02:13.145 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:02:13.145 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:02:13.145 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:02:13.145 + rm -f /tmp/spdk-ld-path
00:02:13.145 + source autorun-spdk.conf
00:02:13.145 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:02:13.145 ++ SPDK_TEST_FUZZER_SHORT=1
00:02:13.145 ++ SPDK_TEST_FUZZER=1
00:02:13.145 ++ SPDK_TEST_SETUP=1
00:02:13.145 ++ SPDK_RUN_UBSAN=1
00:02:13.145 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:02:13.145 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:13.145 ++ RUN_NIGHTLY=1
00:02:13.145 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:02:13.145 + [[ -n '' ]]
00:02:13.145 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:13.145 + for M in /var/spdk/build-*-manifest.txt
00:02:13.145 + [[ -f /var/spdk/build-kernel-manifest.txt ]]
00:02:13.145 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:02:13.145 + for M in /var/spdk/build-*-manifest.txt
00:02:13.145 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:02:13.145 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:02:13.145 + for M in /var/spdk/build-*-manifest.txt
00:02:13.145 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:02:13.145 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:02:13.145 ++ uname
00:02:13.145 + [[ Linux == \L\i\n\u\x ]]
00:02:13.145 + sudo dmesg -T
00:02:13.145 + sudo dmesg --clear
00:02:13.145 + dmesg_pid=6196
00:02:13.145 + [[ Fedora Linux == FreeBSD ]]
00:02:13.145 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:02:13.145 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:02:13.145 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:02:13.145 + sudo dmesg -Tw
00:02:13.145 + [[ -x /usr/src/fio-static/fio ]]
00:02:13.145 + export FIO_BIN=/usr/src/fio-static/fio
00:02:13.145 + FIO_BIN=/usr/src/fio-static/fio
00:02:13.145 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:02:13.145 + [[ ! -v VFIO_QEMU_BIN ]]
00:02:13.145 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:02:13.145 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:02:13.145 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:02:13.145 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:02:13.145 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:02:13.145 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:02:13.145 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:02:13.145 12:44:16 -- common/autotest_common.sh@1710 -- $ [[ n == y ]]
00:02:13.145 12:44:16 -- spdk/autorun.sh@20 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:02:13.145 12:44:16 -- short-fuzz-phy-autotest/autorun-spdk.conf@1 -- $ SPDK_RUN_FUNCTIONAL_TEST=1
00:02:13.145 12:44:16 -- short-fuzz-phy-autotest/autorun-spdk.conf@2 -- $ SPDK_TEST_FUZZER_SHORT=1
00:02:13.145 12:44:16 -- short-fuzz-phy-autotest/autorun-spdk.conf@3 -- $ SPDK_TEST_FUZZER=1
00:02:13.145 12:44:16 -- short-fuzz-phy-autotest/autorun-spdk.conf@4 -- $ SPDK_TEST_SETUP=1
00:02:13.145 12:44:16 -- short-fuzz-phy-autotest/autorun-spdk.conf@5 -- $ SPDK_RUN_UBSAN=1
00:02:13.145 12:44:16 -- short-fuzz-phy-autotest/autorun-spdk.conf@6 -- $ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:02:13.145 12:44:16 -- short-fuzz-phy-autotest/autorun-spdk.conf@7 -- $ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:13.145 12:44:16 -- short-fuzz-phy-autotest/autorun-spdk.conf@8 -- $ RUN_NIGHTLY=1
00:02:13.145 12:44:16 -- spdk/autorun.sh@22 -- $ trap 'timing_finish || exit 1' EXIT
00:02:13.145 12:44:16 -- spdk/autorun.sh@25 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autobuild.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:02:13.145 12:44:16 -- common/autotest_common.sh@1710 -- $ [[ n == y ]]
00:02:13.145 12:44:16 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:02:13.145 12:44:16 -- scripts/common.sh@15 -- $ shopt -s extglob
00:02:13.145 12:44:16 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]]
00:02:13.145 12:44:16 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:02:13.145 12:44:16 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:02:13.145 12:44:16 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:13.145 12:44:16 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:13.145 12:44:16 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:13.145 12:44:16 -- paths/export.sh@5 -- $ export PATH
00:02:13.145 12:44:16 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:13.145 12:44:16 -- common/autobuild_common.sh@492 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:02:13.145 12:44:16 -- common/autobuild_common.sh@493 -- $ date +%s
00:02:13.145 12:44:16 -- common/autobuild_common.sh@493 -- $ mktemp -dt spdk_1733399056.XXXXXX
00:02:13.145 12:44:16 -- common/autobuild_common.sh@493 -- $ SPDK_WORKSPACE=/tmp/spdk_1733399056.EFGKi6
00:02:13.145 12:44:16 -- common/autobuild_common.sh@495 -- $ [[ -n '' ]]
00:02:13.145 12:44:16 -- common/autobuild_common.sh@499 -- $ '[' -n v22.11.4 ']'
00:02:13.145 12:44:16 -- common/autobuild_common.sh@500 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:13.145 12:44:16 -- common/autobuild_common.sh@500 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:02:13.145 12:44:16 -- common/autobuild_common.sh@506 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:02:13.145 12:44:16 -- common/autobuild_common.sh@508 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:02:13.145 12:44:16 -- common/autobuild_common.sh@509 -- $ get_config_params
00:02:13.145 12:44:16 -- common/autotest_common.sh@409 -- $ xtrace_disable
00:02:13.145 12:44:16 -- common/autotest_common.sh@10 -- $ set +x
00:02:13.145 12:44:16 -- common/autobuild_common.sh@509 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:02:13.145 12:44:16 -- common/autobuild_common.sh@511 -- $ start_monitor_resources
00:02:13.145 12:44:16 -- pm/common@17 -- $ local monitor
00:02:13.145 12:44:16 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:13.145 12:44:16 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:13.145 12:44:16 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:13.145 12:44:16 -- pm/common@21 -- $ date +%s
00:02:13.145 12:44:16 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:13.145 12:44:16 -- pm/common@21 -- $ date +%s
00:02:13.145 12:44:16 -- pm/common@25 -- $ sleep 1
00:02:13.145 12:44:16 -- pm/common@21 -- $ date +%s
00:02:13.145 12:44:16 -- pm/common@21 -- $ date +%s
00:02:13.145 12:44:16 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1733399056
00:02:13.145 12:44:16 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1733399056
00:02:13.145 12:44:16 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1733399056
00:02:13.145 12:44:16 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1733399056
00:02:13.406 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1733399056_collect-cpu-load.pm.log
00:02:13.406 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1733399056_collect-vmstat.pm.log
00:02:13.406 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1733399056_collect-cpu-temp.pm.log
00:02:13.406 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1733399056_collect-bmc-pm.bmc.pm.log
00:02:14.349 12:44:17 -- common/autobuild_common.sh@512 -- $ trap stop_monitor_resources EXIT
00:02:14.349 12:44:17 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:02:14.349 12:44:17 -- spdk/autobuild.sh@12 -- $ umask 022
00:02:14.349 12:44:17 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:14.349 12:44:17 -- spdk/autobuild.sh@16 -- $ date -u
00:02:14.349 Thu Dec 5 11:44:17 AM UTC 2024
00:02:14.349 12:44:17 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:02:14.349 v25.01-pre-296-g8d3947977
00:02:14.349 12:44:17 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:02:14.349 12:44:17 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:02:14.349 12:44:17 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:02:14.349 12:44:17 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']'
00:02:14.349 12:44:17 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:02:14.349 12:44:17 -- common/autotest_common.sh@10 -- $ set +x
00:02:14.349 ************************************
00:02:14.349 START TEST ubsan
00:02:14.349 ************************************
00:02:14.349 12:44:17 ubsan -- common/autotest_common.sh@1129 -- $ echo 'using ubsan'
00:02:14.349 using ubsan
00:02:14.349
00:02:14.349 real 0m0.001s
00:02:14.349 user 0m0.000s
00:02:14.349 sys 0m0.001s
00:02:14.349 12:44:17 ubsan -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:02:14.349 12:44:17 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:02:14.349 ************************************
00:02:14.349 END TEST ubsan
00:02:14.349 ************************************
00:02:14.349 12:44:17 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']'
00:02:14.349 12:44:17 -- spdk/autobuild.sh@28 -- $ build_native_dpdk
00:02:14.349 12:44:17 -- common/autobuild_common.sh@449 -- $ run_test build_native_dpdk _build_native_dpdk
00:02:14.349 12:44:17 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']'
00:02:14.349 12:44:17 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:02:14.349 12:44:17 -- common/autotest_common.sh@10 -- $ set +x
00:02:14.349 ************************************
00:02:14.349 START TEST build_native_dpdk
00:02:14.349 ************************************
00:02:14.349 12:44:17 build_native_dpdk -- common/autotest_common.sh@1129 -- $ _build_native_dpdk
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]]
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]]
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]]
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5
00:02:14.349 caf0f5d395 version: 22.11.4
00:02:14.349 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt"
00:02:14.349 dc9c799c7d vhost: fix missing spinlock unlock
00:02:14.349 4307659a90 net/mlx5: fix LACP redirection in Rx domain
00:02:14.349 6ef77f2a5e net/gve: fix RX buffer size alignment
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon'
00:02:14.349 12:44:17 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags=
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]]
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]]
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror'
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]]
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]]
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow'
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@102 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base" "power/acpi" "power/amd_pstate" "power/cppc" "power/intel_pstate" "power/intel_uncore" "power/kvm_vm")
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@103 -- $ local mlx5_libs_added=n
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]]
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]]
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@146 -- $ [[ 0 -eq 1 ]]
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@174 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@175 -- $ uname -s
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@175 -- $ '[' Linux = Linux ']'
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 22.11.4 21.11.0
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '<' 21.11.0
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<'
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@345 -- $ : 1
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]]
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]]
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@367 -- $ return 1
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@180 -- $ patch -p1
00:02:14.610 patching file config/rte_config.h
00:02:14.610 Hunk #1 succeeded at 60 (offset 1 line).
00:02:14.610 12:44:17 build_native_dpdk -- common/autobuild_common.sh@183 -- $ lt 22.11.4 24.07.0
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '<' 24.07.0
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<'
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@345 -- $ : 1
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]]
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24
00:02:14.610 12:44:17 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]]
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] ))
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@368 -- $ return 0
00:02:14.611 12:44:17 build_native_dpdk -- common/autobuild_common.sh@184 -- $ patch -p1
00:02:14.611 patching file lib/pcapng/rte_pcapng.c
00:02:14.611 Hunk #1 succeeded at 110 (offset -18 lines).
00:02:14.611 12:44:17 build_native_dpdk -- common/autobuild_common.sh@186 -- $ ge 22.11.4 24.07.0
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 22.11.4 '>=' 24.07.0
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>='
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@348 -- $ : 1
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]]
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]]
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] ))
00:02:14.611 12:44:17 build_native_dpdk -- scripts/common.sh@368 -- $ return 1
00:02:14.611 12:44:17 build_native_dpdk -- common/autobuild_common.sh@190 -- $ dpdk_kmods=false
00:02:14.611 12:44:17 build_native_dpdk -- common/autobuild_common.sh@191 -- $ uname -s
00:02:14.611 12:44:17 build_native_dpdk -- common/autobuild_common.sh@191 -- $ '[' Linux = FreeBSD ']'
00:02:14.611 12:44:17 build_native_dpdk -- common/autobuild_common.sh@195 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base power/acpi power/amd_pstate power/cppc power/intel_pstate power/intel_uncore power/kvm_vm
00:02:14.611 12:44:17 build_native_dpdk -- common/autobuild_common.sh@195 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm,
00:02:21.191 The Meson build system
00:02:21.191 Version: 1.5.0
00:02:21.191 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:02:21.191 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp
00:02:21.191 Build type: native build
00:02:21.191 Program cat found: YES (/usr/bin/cat)
00:02:21.191 Project name: DPDK
00:02:21.191 Project version: 22.11.4
00:02:21.191 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)")
00:02:21.191 C linker for the host machine: gcc ld.bfd 2.40-14
00:02:21.191 Host machine cpu family: x86_64
00:02:21.191 Host machine cpu: x86_64
00:02:21.191 Message: ## Building in Developer Mode ##
00:02:21.191 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:21.191 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh)
00:02:21.191 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh)
00:02:21.192 Program objdump found: YES (/usr/bin/objdump)
00:02:21.192 Program python3 found: YES (/usr/bin/python3)
00:02:21.192 Program cat found: YES (/usr/bin/cat)
00:02:21.192 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead.
00:02:21.192 Checking for size of "void *" : 8
00:02:21.192 Checking for size of "void *" : 8 (cached)
00:02:21.192 Library m found: YES
00:02:21.192 Library numa found: YES
00:02:21.192 Has header "numaif.h" : YES
00:02:21.192 Library fdt found: NO
00:02:21.192 Library execinfo found: NO
00:02:21.192 Has header "execinfo.h" : YES
00:02:21.192 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:02:21.192 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:21.192 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:21.192 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:21.192 Run-time dependency openssl found: YES 3.1.1
00:02:21.192 Run-time dependency libpcap found: YES 1.10.4
00:02:21.192 Has header "pcap.h" with dependency libpcap: YES
00:02:21.192 Compiler for C supports arguments -Wcast-qual: YES
00:02:21.192 Compiler for C supports arguments -Wdeprecated: YES
00:02:21.192 Compiler for C supports arguments -Wformat: YES
00:02:21.192 Compiler for C supports arguments -Wformat-nonliteral: NO
00:02:21.192 Compiler for C supports arguments -Wformat-security: NO
00:02:21.192 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:21.192 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:21.192 Compiler for C supports arguments -Wnested-externs: YES
00:02:21.192 Compiler for C supports arguments -Wold-style-definition: YES
00:02:21.192 Compiler for C supports arguments -Wpointer-arith: YES
00:02:21.192 Compiler for C supports arguments -Wsign-compare: YES
00:02:21.192 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:21.192 Compiler for C supports arguments -Wundef: YES
00:02:21.192 Compiler for C supports arguments -Wwrite-strings: YES
00:02:21.192 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:21.192 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:02:21.192 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:21.192 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:02:21.192 Compiler for C supports arguments -mavx512f: YES
00:02:21.192 Checking if "AVX512 checking" compiles: YES
00:02:21.192 Fetching value of define "__SSE4_2__" : 1
00:02:21.192 Fetching value of define "__AES__" : 1
00:02:21.192 Fetching value of define "__AVX__" : 1
00:02:21.192 Fetching value of define "__AVX2__" : 1
00:02:21.192 Fetching value of define "__AVX512BW__" : 1
00:02:21.192 Fetching value of define "__AVX512CD__" : 1
00:02:21.192 Fetching value of define "__AVX512DQ__" : 1
00:02:21.192 Fetching value of define "__AVX512F__" : 1
00:02:21.192 Fetching value of define "__AVX512VL__" : 1
00:02:21.192 Fetching value of define "__PCLMUL__" : 1
00:02:21.192 Fetching value of define "__RDRND__" : 1
00:02:21.192 Fetching value of define "__RDSEED__" : 1
00:02:21.192 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:21.192 Compiler for C supports arguments -Wno-format-truncation: YES
00:02:21.192 Message: lib/kvargs: Defining dependency "kvargs"
00:02:21.192 Message: lib/telemetry: Defining dependency "telemetry"
00:02:21.192 Checking for function "getentropy" : YES
00:02:21.192 Message: lib/eal: Defining dependency "eal"
00:02:21.192 Message: lib/ring: Defining dependency "ring"
00:02:21.192 Message: lib/rcu: Defining dependency "rcu"
00:02:21.192 Message: lib/mempool: Defining dependency "mempool"
00:02:21.192 Message: lib/mbuf: Defining dependency "mbuf"
00:02:21.192 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:21.192 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:21.192 Compiler for C supports arguments -mpclmul: YES
00:02:21.192 Compiler for C supports arguments -maes: YES
00:02:21.192 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:21.192 Compiler for C supports arguments -mavx512bw: YES
00:02:21.192 Compiler for C supports arguments -mavx512dq: YES
00:02:21.192 Compiler for C supports arguments -mavx512vl: YES
00:02:21.192 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:21.192 Compiler for C supports arguments -mavx2: YES
00:02:21.192 Compiler for C supports arguments -mavx: YES
00:02:21.192 Message: lib/net: Defining dependency "net"
00:02:21.192 Message: lib/meter: Defining dependency "meter"
00:02:21.192 Message: lib/ethdev: Defining dependency "ethdev"
00:02:21.192 Message: lib/pci: Defining dependency "pci"
00:02:21.192 Message: lib/cmdline: Defining dependency "cmdline"
00:02:21.192 Message: lib/metrics: Defining dependency "metrics"
00:02:21.192 Message: lib/hash: Defining dependency "hash"
00:02:21.192 Message: lib/timer: Defining dependency "timer"
00:02:21.192 Fetching value of define "__AVX2__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512CD__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:21.192 Message: lib/acl: Defining dependency "acl"
00:02:21.192 Message: lib/bbdev: Defining dependency "bbdev"
00:02:21.192 Message: lib/bitratestats: Defining dependency "bitratestats"
00:02:21.192 Run-time dependency libelf found: YES 0.191
00:02:21.192 Message: lib/bpf: Defining dependency "bpf"
00:02:21.192 Message: lib/cfgfile: Defining dependency "cfgfile"
00:02:21.192 Message: lib/compressdev: Defining dependency "compressdev"
00:02:21.192 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:21.192 Message: lib/distributor: Defining dependency "distributor"
00:02:21.192 Message: lib/efd: Defining dependency "efd"
00:02:21.192 Message: lib/eventdev: Defining dependency "eventdev"
00:02:21.192 Message: lib/gpudev: Defining dependency "gpudev"
00:02:21.192 Message: lib/gro: Defining dependency "gro"
00:02:21.192 Message: lib/gso: Defining dependency "gso"
00:02:21.192 Message: lib/ip_frag: Defining dependency "ip_frag"
00:02:21.192 Message: lib/jobstats: Defining dependency "jobstats"
00:02:21.192 Message: lib/latencystats: Defining dependency "latencystats"
00:02:21.192 Message: lib/lpm: Defining dependency "lpm"
00:02:21.192 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512IFMA__" : (undefined)
00:02:21.192 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES
00:02:21.192 Message: lib/member: Defining dependency "member"
00:02:21.192 Message: lib/pcapng: Defining dependency "pcapng"
00:02:21.192 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:21.192 Message: lib/power: Defining dependency "power"
00:02:21.192 Message: lib/rawdev: Defining dependency "rawdev"
00:02:21.192 Message: lib/regexdev: Defining dependency "regexdev"
00:02:21.192 Message: lib/dmadev: Defining dependency "dmadev"
00:02:21.192 Message: lib/rib: Defining dependency "rib"
00:02:21.192 Message: lib/reorder: Defining dependency "reorder"
00:02:21.192 Message: lib/sched: Defining dependency "sched"
00:02:21.192 Message: lib/security: Defining dependency "security"
00:02:21.192 Message: lib/stack: Defining dependency "stack"
00:02:21.192 Has header "linux/userfaultfd.h" : YES
00:02:21.192 Message: lib/vhost: Defining dependency "vhost"
00:02:21.192 Message: lib/ipsec: Defining dependency "ipsec"
00:02:21.192 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:21.192 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:21.192 Message: lib/fib: Defining dependency "fib"
00:02:21.192 Message: lib/port: Defining dependency "port"
00:02:21.192 Message: lib/pdump: Defining dependency "pdump"
00:02:21.192 Message: lib/table: Defining dependency "table"
00:02:21.192 Message: lib/pipeline: Defining dependency "pipeline"
00:02:21.192 Message: lib/graph: Defining dependency "graph"
00:02:21.192 Message: lib/node: Defining dependency "node"
00:02:21.192 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:02:21.192 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:21.192 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:21.192 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:02:21.192 Compiler for C supports arguments -Wno-sign-compare: YES
00:02:21.192 Compiler for C supports arguments -Wno-unused-value: YES
00:02:21.192 Compiler for C supports arguments -Wno-format: YES
00:02:21.192 Compiler for C supports arguments -Wno-format-security: YES
00:02:21.192 Compiler for C supports arguments -Wno-format-nonliteral: YES
00:02:21.765 Compiler for C supports arguments -Wno-strict-aliasing: YES
00:02:21.765 Compiler for C supports arguments -Wno-unused-but-set-variable: YES
00:02:21.765 Compiler for C supports arguments -Wno-unused-parameter: YES
00:02:21.765 Fetching value of define "__AVX2__" : 1 (cached)
00:02:21.765 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:21.765 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:21.765 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:21.765 Compiler for C supports arguments -mavx512bw: YES (cached)
00:02:21.765 Compiler for C supports arguments -march=skylake-avx512: YES
00:02:21.766 Message: drivers/net/i40e: Defining dependency "net_i40e"
00:02:21.766 Program doxygen found: YES (/usr/local/bin/doxygen)
00:02:21.766 Configuring doxy-api.conf using configuration
00:02:21.766 Program sphinx-build found: NO
00:02:21.766 Configuring rte_build_config.h using configuration
00:02:21.766 Message:
00:02:21.766 =================
00:02:21.766 Applications Enabled
00:02:21.766 =================
00:02:21.766
00:02:21.766 apps:
00:02:21.766 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf,
00:02:21.766 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad,
00:02:21.766 test-security-perf,
00:02:21.766
00:02:21.766 Message:
00:02:21.766 =================
00:02:21.766 Libraries Enabled
00:02:21.766 =================
00:02:21.766
00:02:21.766 libs:
00:02:21.766 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net,
00:02:21.766 meter, ethdev, pci, cmdline, metrics, hash, timer, acl,
00:02:21.766 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd,
00:02:21.766 eventdev, gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm,
00:02:21.766 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder,
00:02:21.766 sched, security, stack, vhost, ipsec, fib, port, pdump,
00:02:21.766 table, pipeline, graph, node,
00:02:21.766
00:02:21.766 Message:
00:02:21.766 ===============
00:02:21.766 Drivers Enabled
00:02:21.766 ===============
00:02:21.766
00:02:21.766 common:
00:02:21.766
00:02:21.766 bus:
00:02:21.766 pci, vdev,
00:02:21.766 mempool:
00:02:21.766 ring,
00:02:21.766 dma:
00:02:21.766
00:02:21.766 net:
00:02:21.766 i40e,
00:02:21.766 raw:
00:02:21.766
00:02:21.766 crypto:
00:02:21.766
00:02:21.766 compress:
00:02:21.766
00:02:21.766 regex:
00:02:21.766
00:02:21.766 vdpa:
00:02:21.766
00:02:21.766 event:
00:02:21.766
00:02:21.766 baseband:
00:02:21.766
00:02:21.766 gpu:
00:02:21.766
00:02:21.766
00:02:21.766 Message:
00:02:21.766 =================
00:02:21.766 Content Skipped
00:02:21.766 =================
00:02:21.766
00:02:21.766 apps:
00:02:21.766
00:02:21.766 libs:
00:02:21.766 kni: explicitly disabled via build config (deprecated lib)
00:02:21.766 flow_classify: explicitly disabled via build config (deprecated lib)
00:02:21.766
00:02:21.766 drivers:
00:02:21.766 common/cpt: not in enabled drivers build config
00:02:21.766 common/dpaax: not in enabled drivers build config
00:02:21.766 common/iavf: not in enabled drivers build config
00:02:21.766 common/idpf: not in enabled drivers build config
00:02:21.766 common/mvep: not in enabled drivers build config
00:02:21.766 common/octeontx: not in enabled drivers build config
00:02:21.766 bus/auxiliary: not in enabled drivers build config
00:02:21.766 bus/dpaa: not in enabled drivers build config
00:02:21.766 bus/fslmc: not in enabled drivers build config
00:02:21.766 bus/ifpga: not in enabled drivers build config
00:02:21.766 bus/vmbus: not in enabled drivers build config
00:02:21.766 common/cnxk: not in enabled drivers build config
00:02:21.766 common/mlx5: not in enabled drivers build config
00:02:21.766 common/qat: not in enabled drivers build config
00:02:21.766 common/sfc_efx: not in enabled drivers build config
00:02:21.766 mempool/bucket: not in enabled drivers build config
00:02:21.766 mempool/cnxk: not in enabled drivers build config
00:02:21.766 mempool/dpaa: not in enabled drivers build config
00:02:21.766 mempool/dpaa2: not in enabled drivers build config
00:02:21.766 mempool/octeontx: not in enabled drivers build config
00:02:21.766 mempool/stack: not in enabled drivers build config
00:02:21.766 dma/cnxk: not in enabled drivers build config
00:02:21.766 dma/dpaa: not in enabled drivers build config
00:02:21.766 dma/dpaa2: not in enabled drivers build config
00:02:21.766 dma/hisilicon: not in enabled drivers build config
00:02:21.766 dma/idxd: not in enabled drivers build config
00:02:21.766 dma/ioat: not in enabled drivers build config
00:02:21.766 dma/skeleton: not in enabled drivers build config
00:02:21.766 net/af_packet: not in enabled drivers build config
00:02:21.766 net/af_xdp: not in enabled drivers build config
00:02:21.766 net/ark: not in enabled drivers build config
00:02:21.766 net/atlantic: not in enabled drivers build config
00:02:21.766 net/avp: not in enabled drivers build config
00:02:21.766 net/axgbe: not in enabled drivers build config
00:02:21.766 net/bnx2x: not in enabled drivers build config
00:02:21.766 net/bnxt: not in enabled drivers build config
00:02:21.766 net/bonding: not in enabled drivers build config
00:02:21.766 net/cnxk: not in enabled drivers build config
00:02:21.766 net/cxgbe: not in enabled drivers build config
00:02:21.766 net/dpaa: not in enabled drivers build config
00:02:21.766 net/dpaa2: not in enabled drivers build config
00:02:21.766 net/e1000: not in enabled drivers build config
00:02:21.766 net/ena: not in enabled drivers build config
00:02:21.766 net/enetc: not in enabled drivers build config
00:02:21.766 net/enetfec: not in enabled drivers build config
00:02:21.766 net/enic: not in enabled drivers build config
00:02:21.766 net/failsafe: not in enabled drivers build config
00:02:21.766 net/fm10k: not in enabled drivers build config
00:02:21.766 net/gve: not in enabled drivers build config
00:02:21.766 net/hinic: not in enabled drivers build config
00:02:21.766 net/hns3: not in enabled drivers build config
00:02:21.766 net/iavf: not in enabled drivers build config
00:02:21.766 net/ice: not in enabled drivers build config
00:02:21.766 net/idpf: not in enabled drivers build config
00:02:21.766 net/igc: not in enabled drivers build config
00:02:21.766 net/ionic: not in enabled drivers build config
00:02:21.766 net/ipn3ke: not in enabled drivers build config
00:02:21.766 net/ixgbe: not in enabled drivers build config
00:02:21.766 net/kni: not in enabled drivers build config
00:02:21.766 net/liquidio: not in enabled drivers build config
00:02:21.766 net/mana: not in enabled drivers build config
00:02:21.766 net/memif: not in enabled drivers build config
00:02:21.766 net/mlx4: not in enabled drivers build config
00:02:21.766 net/mlx5: not in enabled drivers build config
00:02:21.766 net/mvneta: not in enabled drivers build config
00:02:21.767 net/mvpp2: not in enabled drivers build config
00:02:21.767 net/netvsc: not in enabled drivers build config
00:02:21.767 net/nfb: not in enabled drivers build config
00:02:21.767 net/nfp: not in enabled drivers build config
00:02:21.767 net/ngbe: not in enabled drivers build config
00:02:21.767 net/null: not in enabled drivers build config
00:02:21.767 net/octeontx: not in enabled drivers build config
00:02:21.767 net/octeon_ep: not in enabled drivers build config
00:02:21.767 net/pcap: not in enabled drivers build config
00:02:21.767 net/pfe: not in enabled drivers build config
00:02:21.767 net/qede: not in enabled drivers build config
00:02:21.767 net/ring: not in enabled drivers build config
00:02:21.767 net/sfc: not in enabled drivers build config
00:02:21.767 net/softnic: not in enabled drivers build config
00:02:21.767 net/tap: not in enabled drivers build config
00:02:21.767 net/thunderx: not in enabled drivers build config
00:02:21.767 net/txgbe: not in enabled drivers build config
00:02:21.767 net/vdev_netvsc: not in enabled drivers build config
00:02:21.767 net/vhost: not in enabled drivers build config
00:02:21.767 net/virtio: not in enabled drivers build config
00:02:21.767 net/vmxnet3: not in enabled drivers build config
00:02:21.767 raw/cnxk_bphy: not in enabled drivers build config
00:02:21.767 raw/cnxk_gpio: not in enabled drivers build config
00:02:21.767 raw/dpaa2_cmdif: not in enabled drivers build config
00:02:21.767 raw/ifpga: not in enabled drivers build config
00:02:21.767 raw/ntb: not in enabled drivers build config
00:02:21.767 raw/skeleton: not in enabled drivers build config
00:02:21.767 crypto/armv8: not in enabled drivers build config
00:02:21.767 crypto/bcmfs: not in enabled drivers build config
00:02:21.767 crypto/caam_jr: not in enabled drivers build config
00:02:21.767 crypto/ccp: not in enabled drivers build config
00:02:21.767 crypto/cnxk: not in enabled drivers build config
00:02:21.767 crypto/dpaa_sec: not in enabled drivers build config
00:02:21.767 crypto/dpaa2_sec: not in enabled drivers build config
00:02:21.767 crypto/ipsec_mb: not in enabled drivers build config
00:02:21.767 crypto/mlx5: not in enabled drivers build config
00:02:21.767 crypto/mvsam: not in enabled drivers build config
00:02:21.767 crypto/nitrox: not in enabled drivers build config
00:02:21.767 crypto/null: not in enabled drivers build config
00:02:21.767 crypto/octeontx: not in enabled drivers build config
00:02:21.767 crypto/openssl: not in enabled drivers build config
00:02:21.767 crypto/scheduler: not in enabled drivers build config
00:02:21.767 crypto/uadk: not in enabled drivers build config
00:02:21.767 crypto/virtio: not in enabled drivers build config
00:02:21.767 compress/isal: not in enabled drivers build config
00:02:21.767 compress/mlx5: not in enabled drivers build config
00:02:21.767 compress/octeontx: not in enabled drivers build config
00:02:21.767 compress/zlib: not in enabled drivers build config
00:02:21.767 regex/mlx5: not in enabled drivers build config
00:02:21.767 regex/cn9k: not in enabled drivers build config
00:02:21.767 vdpa/ifc: not in enabled drivers build config
00:02:21.767 vdpa/mlx5: not in enabled drivers build config
00:02:21.767 vdpa/sfc: not in enabled drivers build config
00:02:21.767 event/cnxk: not in enabled drivers build config
00:02:21.767 event/dlb2: not in enabled drivers build config
00:02:21.767 event/dpaa: not in enabled drivers build config
00:02:21.767 event/dpaa2: not in enabled drivers build config
00:02:21.767 event/dsw: not in enabled drivers build config
00:02:21.767 event/opdl: not in enabled drivers build config
00:02:21.767 event/skeleton: not in enabled drivers build config
00:02:21.767 event/sw: not in enabled drivers build config
00:02:21.767 event/octeontx: not in enabled drivers build config
00:02:21.767 baseband/acc: not in enabled drivers build config
00:02:21.767 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:02:21.767 baseband/fpga_lte_fec: not in enabled drivers build config
00:02:21.767 baseband/la12xx: not in enabled drivers build config
00:02:21.767 baseband/null: not in enabled drivers build config
00:02:21.767 baseband/turbo_sw: not in enabled drivers build config
00:02:21.767 gpu/cuda: not in enabled drivers build config
00:02:21.767
00:02:21.767
00:02:21.767 Build targets in project: 311
00:02:21.767
00:02:21.767 DPDK 22.11.4
00:02:21.767
00:02:21.767 User defined options
00:02:21.767 libdir : lib
00:02:21.767 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:21.767 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:02:21.767 c_link_args :
00:02:21.767 enable_docs : false
00:02:21.767 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm,
00:02:21.767 enable_kmods : false
00:02:21.767 machine : native
00:02:21.767 tests : false
00:02:21.767
00:02:21.767 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:21.767 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
00:02:21.767 12:44:24 build_native_dpdk -- common/autobuild_common.sh@199 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112
00:02:21.767 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:02:21.767 [1/740] Generating lib/rte_kvargs_def with a custom command
00:02:21.767 [2/740] Generating lib/rte_kvargs_mingw with a custom command
00:02:21.767 [3/740] Generating lib/rte_telemetry_def with a custom command
00:02:21.767 [4/740] Generating lib/rte_telemetry_mingw with a custom command
00:02:22.032 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:02:22.032 [6/740] Generating lib/rte_rcu_def with a custom command
00:02:22.032 [7/740] Generating lib/rte_eal_mingw with a custom command
00:02:22.032 [8/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:02:22.032 [9/740] Generating lib/rte_ring_def with a custom command
00:02:22.032 [10/740] Generating lib/rte_mempool_def with a custom command
00:02:22.032 [11/740] Generating lib/rte_mempool_mingw with a custom command
00:02:22.032 [12/740] Generating lib/rte_ring_mingw with a custom command
00:02:22.032 [13/740] Generating lib/rte_rcu_mingw with a custom command
00:02:22.032 [14/740] Generating lib/rte_eal_def with a custom command
00:02:22.032 [15/740] Generating lib/rte_net_def with a custom command
00:02:22.032 [16/740] Generating lib/rte_net_mingw with a custom command
00:02:22.032 [17/740] Generating lib/rte_meter_def with a custom command
00:02:22.032 [18/740] Generating lib/rte_meter_mingw with a custom command
00:02:22.032 [19/740] Generating lib/rte_mbuf_def with a custom command
00:02:22.032 [20/740] Generating lib/rte_mbuf_mingw with a custom command
00:02:22.032 [21/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:02:22.032 [22/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:02:22.032 [23/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:02:22.032 [24/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:02:22.032 [25/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:02:22.032 [26/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o
00:02:22.032 [27/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:02:22.032 [28/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:02:22.032 [29/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:02:22.032 [30/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:02:22.032 [31/740] Generating lib/rte_pci_def with a custom command
00:02:22.032 [32/740] Generating lib/rte_ethdev_def with a custom command
00:02:22.032 [33/740] Generating lib/rte_ethdev_mingw with a custom command
00:02:22.032 [34/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:02:22.032 [35/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:02:22.032 [36/740] Generating lib/rte_pci_mingw with a custom command
00:02:22.032 [37/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:02:22.032 [38/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:02:22.032 [39/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:02:22.032 [40/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:02:22.032 [41/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:02:22.032 [42/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:02:22.032 [43/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:02:22.032 [44/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:02:22.032 [45/740] Linking static target lib/librte_kvargs.a
00:02:22.032 [46/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:02:22.032 [47/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:02:22.032 [48/740] Generating lib/rte_cmdline_def with a custom command
00:02:22.032 [49/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:02:22.032 [50/740] Generating lib/rte_cmdline_mingw with a custom command
00:02:22.032 [51/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:02:22.032 [52/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:02:22.032 [53/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:02:22.032 [54/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:02:22.032 [55/740] Generating lib/rte_metrics_def with a custom command
00:02:22.032 [56/740] Generating lib/rte_hash_def with a custom command
00:02:22.032 [57/740] Generating lib/rte_hash_mingw with a custom command
00:02:22.032 [58/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:02:22.032 [59/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:02:22.032 [60/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:02:22.032 [61/740] Generating lib/rte_metrics_mingw with a custom command
00:02:22.032 [62/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:02:22.032 [63/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:02:22.032 [64/740] Generating lib/rte_timer_def with a custom command
00:02:22.032 [65/740] Generating lib/rte_timer_mingw with a custom command
00:02:22.032 [66/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:02:22.294 [67/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:02:22.294 [68/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:02:22.294 [69/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:02:22.294 [70/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:02:22.294 [71/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:02:22.294 [72/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:02:22.294 [73/740] Generating lib/rte_acl_def with a custom command
00:02:22.294 [74/740] Generating lib/rte_acl_mingw with a custom command
00:02:22.294 [75/740] Generating lib/rte_bitratestats_def with a custom command
00:02:22.294 [76/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:02:22.294 [77/740] Generating lib/rte_bbdev_def with a custom command
00:02:22.294 [78/740] Linking static target lib/librte_pci.a
00:02:22.294 [79/740] Generating lib/rte_bbdev_mingw with a custom command
00:02:22.294 [80/740] Generating lib/rte_bitratestats_mingw with a custom command
00:02:22.294 [81/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:02:22.294 [82/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:02:22.294 [83/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:02:22.294 [84/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:02:22.294 [85/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:02:22.294 [86/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:02:22.294 [87/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:02:22.294 [88/740] Generating lib/rte_bpf_def with a custom command
00:02:22.294 [89/740] Generating lib/rte_bpf_mingw with a custom command
00:02:22.294 [90/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:02:22.294 [91/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:02:22.294 [92/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:02:22.294 [93/740] Linking static target lib/librte_ring.a
00:02:22.294 [94/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:02:22.294 [95/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:02:22.294 [96/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:02:22.294 [97/740] Generating lib/rte_cfgfile_mingw with a custom command
00:02:22.294 [98/740] Generating lib/rte_cfgfile_def with a custom command
00:02:22.294 [99/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o
00:02:22.294 [100/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:02:22.294 [101/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:02:22.294 [102/740] Linking static target lib/librte_meter.a
00:02:22.294 [103/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:02:22.294 [104/740] Generating lib/rte_compressdev_def with a custom command
00:02:22.294 [105/740] Generating lib/rte_compressdev_mingw with a custom command
00:02:22.294 [106/740] Generating lib/rte_cryptodev_def with a custom command
00:02:22.294 [107/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:02:22.294 [108/740] Generating lib/rte_cryptodev_mingw with a custom command
00:02:22.294 [109/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:02:22.294 [110/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:02:22.294 [111/740] Generating lib/rte_distributor_def with a custom command
00:02:22.294 [112/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:02:22.294 [113/740] Generating lib/rte_distributor_mingw with a custom command
00:02:22.294 [114/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:02:22.294 [115/740] Generating lib/rte_efd_def with a custom command
00:02:22.294 [116/740] Generating lib/rte_efd_mingw with a custom command
00:02:22.294 [117/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:02:22.294 [118/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:02:22.294 [119/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:02:22.294 [120/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:02:22.295 [121/740] Generating lib/rte_eventdev_mingw with a custom command
00:02:22.295 [122/740] Generating lib/rte_eventdev_def with a custom command
00:02:22.295 [123/740] Generating lib/rte_gpudev_def with a custom command
00:02:22.295 [124/740] Generating lib/rte_gpudev_mingw with a custom command
00:02:22.295 [125/740] Generating lib/rte_gro_def with a custom command
00:02:22.295 [126/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o
00:02:22.295 [127/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:02:22.295 [128/740] Generating lib/rte_gro_mingw with a custom command
00:02:22.295 [129/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:02:22.295 [130/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:02:22.295 [131/740] Generating lib/rte_gso_mingw with a custom command
00:02:22.295 [132/740] Generating lib/rte_gso_def with a custom command
00:02:22.563 [133/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:02:22.563 [134/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:02:22.563 [135/740] Generating lib/rte_ip_frag_def with a custom command
00:02:22.563 [136/740] Generating lib/rte_ip_frag_mingw with a custom command
00:02:22.563 [137/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:02:22.563 [138/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:02:22.563 [139/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:02:22.563 [140/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:02:22.563 [141/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:02:22.564 [142/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:02:22.564 [143/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o
00:02:22.564 [144/740] Generating lib/rte_jobstats_mingw with a custom command
00:02:22.564 [145/740] Generating lib/rte_jobstats_def with a custom command
00:02:22.564 [146/740] Linking static target lib/librte_cfgfile.a
00:02:22.564 [147/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:02:22.564 [148/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:02:22.564 [149/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output)
00:02:22.564 [150/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:02:22.564 [151/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:02:22.564 [152/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:02:22.564 [153/740] Generating lib/rte_latencystats_def with a custom command
00:02:22.564 [154/740] Generating lib/rte_latencystats_mingw with a custom command
00:02:22.564 [155/740] Generating lib/rte_lpm_def with a custom command
00:02:22.564 [156/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:02:22.564 [157/740] Generating lib/rte_lpm_mingw with a custom command
00:02:22.564 [158/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:02:22.564 [159/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output)
00:02:22.564 [160/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:02:22.564 [161/740] Linking target lib/librte_kvargs.so.23.0
00:02:22.564 [162/740] Generating lib/rte_member_def with a custom command
00:02:22.564 [163/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:02:22.828 [164/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o
00:02:22.828 [165/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o
00:02:22.828 [166/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:02:22.829 [167/740] Generating lib/rte_member_mingw with a custom command
00:02:22.829 [168/740] Generating lib/rte_pcapng_def with a custom command
00:02:22.829 [169/740] Generating lib/rte_pcapng_mingw with a custom command
00:02:22.829 [170/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o
00:02:22.829 [171/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:02:22.829 [172/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:02:22.829 [173/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o
00:02:22.829 [174/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:02:22.829 [175/740] Linking static target lib/librte_jobstats.a
00:02:22.829 [176/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:02:22.829 [177/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:02:22.829 [178/740] Generating lib/rte_power_def with a custom command
00:02:22.829 [179/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:02:22.829 [180/740] Linking static target lib/librte_timer.a
00:02:22.829 [181/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o
00:02:22.829 [182/740] Generating lib/rte_power_mingw with a custom command
00:02:22.829 [183/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:02:22.829 [184/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:02:22.829 [185/740] Generating lib/rte_rawdev_def with a custom command
00:02:22.829 [186/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:02:22.829 [187/740] Generating lib/rte_rawdev_mingw with a custom command
00:02:22.829 [188/740] Linking static target lib/net/libnet_crc_avx512_lib.a
00:02:22.829 [189/740] Linking static target lib/librte_cmdline.a
00:02:22.829 [190/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:02:22.829 [191/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o
00:02:22.829 [192/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o
00:02:22.829 [193/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o
00:02:22.829 [194/740] Generating lib/rte_regexdev_def with a custom command
00:02:22.829 [195/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:02:22.829 [196/740] Linking static target lib/librte_metrics.a
00:02:22.829 [197/740] Generating lib/rte_regexdev_mingw with a custom command
00:02:22.829 [198/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:02:22.829 [199/740] Generating lib/rte_dmadev_def with a custom command
00:02:22.829 [200/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:02:22.829 [201/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:02:22.829 [202/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:02:22.829 [203/740] Generating lib/rte_dmadev_mingw with a custom command
00:02:22.829 [204/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:02:22.829 [205/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:02:22.829 [206/740] Linking static target lib/librte_telemetry.a
00:02:22.829 [207/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:02:22.829 [208/740] Generating lib/rte_rib_def with a custom command
00:02:22.829 [209/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:02:22.829 [210/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o
00:02:22.829 [211/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o
00:02:22.829 [212/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o
00:02:22.829 [213/740] Generating lib/rte_reorder_def with a custom command
00:02:22.829 [214/740] Generating lib/rte_reorder_mingw with a custom command
00:02:22.829 [215/740] Generating lib/rte_rib_mingw with a custom command
00:02:22.829 [216/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:02:22.829 [217/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o
00:02:22.829 [218/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o
00:02:22.829 [219/740] Generating lib/rte_sched_mingw with a custom command
00:02:22.829 [220/740] Generating lib/rte_sched_def with a custom command
00:02:22.829 [221/740] Generating lib/rte_security_mingw with a custom command
00:02:22.829 [222/740] Generating lib/rte_security_def with a custom command
00:02:22.829 [223/740] Linking static target lib/librte_net.a
00:02:22.829 [224/740] Generating lib/rte_stack_mingw with a custom command
00:02:22.829 [225/740] Generating lib/rte_stack_def with a custom command
00:02:22.829 [226/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o
00:02:22.829 [227/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
00:02:22.829 [228/740] Linking static target lib/librte_bitratestats.a
00:02:22.829 [229/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o
00:02:22.829 [230/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o
00:02:22.829 [231/740] Generating lib/rte_vhost_def with a custom command
00:02:22.829 [232/740] Generating lib/rte_vhost_mingw with a custom command
00:02:22.829 [233/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o
00:02:23.093 [234/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:02:23.093 [235/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:02:23.093 [236/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o
00:02:23.093 [237/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:02:23.093 [238/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o
00:02:23.093 [239/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o
00:02:23.093 [240/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:02:23.093 [241/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:02:23.093 [242/740] Generating lib/rte_ipsec_def with a custom command
00:02:23.093 [243/740] Generating lib/rte_ipsec_mingw with a custom command
00:02:23.093 [244/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o
00:02:23.093 [245/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o
00:02:23.093 [246/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o
00:02:23.093 [247/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o
00:02:23.093 [248/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o
00:02:23.093 [249/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o
00:02:23.093 [250/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o
00:02:23.093 [251/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o
00:02:23.093 [252/740] Linking static target lib/librte_stack.a
00:02:23.093 [253/740] Generating lib/rte_fib_def with a custom command
00:02:23.093 [254/740] Generating lib/rte_fib_mingw with a custom command
00:02:23.093 [255/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o
00:02:23.093 [256/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o
00:02:23.093 [257/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o
00:02:23.093 [258/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o
00:02:23.093 [259/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o
00:02:23.093 [260/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o
00:02:23.093 [261/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o
00:02:23.093 [262/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o
00:02:23.093 [263/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o
00:02:23.093 [264/740] Generating lib/rte_port_mingw with a custom command
00:02:23.093 [265/740] Generating lib/rte_port_def with a custom command
00:02:23.093 [266/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:02:23.093 [267/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o
00:02:23.093 [268/740] Generating lib/rte_pdump_def with a custom command
00:02:23.093 [269/740] Linking static target lib/librte_compressdev.a
00:02:23.094 [270/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.094 [271/740] Generating lib/rte_pdump_mingw with a custom command
00:02:23.094 [272/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o
00:02:23.094 [273/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o
00:02:23.094 [274/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o
00:02:23.094 [275/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o
00:02:23.094 [276/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o
00:02:23.094 [277/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o
00:02:23.094 [278/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:02:23.094 [279/740] Linking static target lib/librte_rcu.a
00:02:23.358 [280/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o
00:02:23.358 [281/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.358 [282/740] Linking static target lib/librte_mempool.a
00:02:23.358 [283/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o
00:02:23.358 [284/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o
00:02:23.358 [285/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o
00:02:23.358 [286/740] Linking static target lib/librte_rawdev.a
00:02:23.358 [287/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o
00:02:23.358 [288/740] Generating lib/rte_table_def with a custom command
00:02:23.358 [289/740] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o
00:02:23.358 [290/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o
00:02:23.358 [291/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.358 [292/740] Linking static target lib/librte_gpudev.a
00:02:23.358 [293/740] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o
00:02:23.359 [294/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o
00:02:23.359 [295/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o
00:02:23.359 [296/740] Linking static target lib/librte_gro.a
00:02:23.359 [297/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.359 [298/740] Linking static target lib/librte_dmadev.a
00:02:23.359 [299/740] Linking static target lib/librte_bbdev.a
00:02:23.359 [300/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o
00:02:23.359 [301/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o
00:02:23.359 [302/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.359 [303/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.359 [304/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o
00:02:23.359 [305/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o
00:02:23.359 [306/740] Generating lib/rte_table_mingw with a custom command
00:02:23.359 [307/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o
00:02:23.359 [308/740] Generating lib/rte_pipeline_def with a custom command
00:02:23.359 [309/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o
00:02:23.359 [310/740] Generating lib/rte_pipeline_mingw with a custom command
00:02:23.359 [311/740] Linking static target lib/librte_gso.a
00:02:23.359 [312/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o
00:02:23.359 [313/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.359 [314/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.359 [315/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o
00:02:23.359 [316/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o
00:02:23.359 [317/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o
00:02:23.359 [318/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o
00:02:23.359 [319/740] Linking static target lib/member/libsketch_avx512_tmp.a
00:02:23.359 [320/740] Linking static target lib/librte_latencystats.a
00:02:23.359 [321/740] Generating lib/rte_graph_def with a custom command
00:02:23.627 [322/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o
00:02:23.627 [323/740] Linking static target lib/librte_distributor.a
00:02:23.627 [324/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o
00:02:23.627 [325/740] Linking target lib/librte_telemetry.so.23.0
00:02:23.627 [326/740] Generating lib/rte_graph_mingw with a custom command
00:02:23.627 [327/740] Linking static target lib/librte_ip_frag.a
00:02:23.627 [328/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o
00:02:23.627 [329/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o
00:02:23.627 [330/740] Linking static target lib/librte_regexdev.a
00:02:23.627 [331/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o
00:02:23.627 [332/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o
00:02:23.627 [333/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o
00:02:23.627 [334/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o
00:02:23.627 [335/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o
00:02:23.627 [336/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o
00:02:23.627 [337/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o
00:02:23.627 [338/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o
00:02:23.627 [339/740] Compiling C object lib/librte_node.a.p/node_null.c.o
00:02:23.627 [340/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o
00:02:23.628 [341/740] Generating lib/rte_node_def with a custom command
00:02:23.628 [342/740] Generating lib/rte_node_mingw with a custom command
00:02:23.628 [343/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.628 [344/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o
00:02:23.628 [345/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.628 [346/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o
00:02:23.628 [347/740] Generating drivers/rte_bus_pci_def with a custom command
00:02:23.628 [348/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o
00:02:23.628 [349/740] Linking static target lib/librte_reorder.a
00:02:23.628 [350/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o
00:02:23.628 [351/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o
00:02:23.628 [352/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o
00:02:23.628 [353/740] Linking static target lib/librte_power.a
00:02:23.896 [354/740] Generating drivers/rte_bus_pci_mingw with a custom command
00:02:23.896 [355/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o
00:02:23.896 [356/740] Linking static target lib/librte_eal.a
00:02:23.896 [357/740] Generating drivers/rte_bus_vdev_def with a custom command
00:02:23.896 [358/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:02:23.896 [359/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.896 [360/740] Generating drivers/rte_bus_vdev_mingw with a custom command
00:02:23.896 [361/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o
00:02:23.896 [362/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o
00:02:23.896 [363/740] Generating drivers/rte_mempool_ring_def with a custom command
00:02:23.896 [364/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:02:23.896 [365/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o
00:02:23.896 [366/740] Linking static target lib/librte_security.a
00:02:23.896 [367/740] Generating drivers/rte_mempool_ring_mingw with a custom command
00:02:23.896 [368/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o
00:02:23.896 [369/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o
00:02:23.896 [370/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.896 [371/740] Linking static target lib/librte_pcapng.a
00:02:23.896 [372/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o
00:02:23.896 [373/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o
00:02:23.896 [374/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o
00:02:23.896 [375/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o
00:02:23.896 [376/740] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o
00:02:23.896 [377/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o
00:02:23.896 [378/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.896 [379/740] Linking static target lib/librte_mbuf.a
00:02:23.896 [380/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o
00:02:23.896 [381/740] Linking static target lib/librte_bpf.a
00:02:23.896 [382/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o
00:02:23.896 [383/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o
00:02:23.896 [384/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o
00:02:23.896 [385/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.896 [386/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o
00:02:23.896 [387/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o
00:02:23.896 [388/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o
00:02:23.896 [389/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.896 [390/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o
00:02:23.896 [391/740] Generating drivers/rte_net_i40e_def with a custom command
00:02:24.164 [392/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o
00:02:24.164 [393/740] Generating drivers/rte_net_i40e_mingw with a custom command
00:02:24.164 [394/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o
00:02:24.164 [395/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o
00:02:24.164 [396/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o
00:02:24.164 [397/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o
00:02:24.164 [398/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o
00:02:24.164 [399/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o
00:02:24.164 [400/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o
00:02:24.164 [401/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o
00:02:24.164 [402/740] Linking static target drivers/libtmp_rte_bus_vdev.a
00:02:24.164 [403/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o
00:02:24.164 [404/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o
00:02:24.164 [405/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o
00:02:24.164 [406/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o
00:02:24.164 [407/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o
00:02:24.164 [408/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o
00:02:24.164 [409/740] Linking static target lib/librte_rib.a
00:02:24.164 [410/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.164 [411/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o
00:02:24.164 [412/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o
00:02:24.164 [413/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o
00:02:24.164 [414/740] Linking static target lib/librte_lpm.a
00:02:24.164 [415/740] Compiling C object lib/librte_node.a.p/node_log.c.o
00:02:24.164 [416/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o
00:02:24.164 [417/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o
00:02:24.164 [418/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o
00:02:24.164 [419/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o
00:02:24.164 [420/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o
00:02:24.164 [421/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o
00:02:24.164 [422/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o
00:02:24.164 [423/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o
00:02:24.164 [424/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o
00:02:24.164 [425/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.164 [426/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o
00:02:24.164 [427/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o
00:02:24.165 [428/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.165 [429/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o
00:02:24.165 [430/740] Linking static target lib/librte_graph.a
00:02:24.429 [431/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o
00:02:24.429 [432/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o
00:02:24.429 [433/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o
00:02:24.429 [434/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o
00:02:24.429 [435/740] Linking static target lib/librte_efd.a
00:02:24.429 [436/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o
00:02:24.429 [437/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o
00:02:24.429 [438/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.429 [439/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.429 [440/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o
00:02:24.429 [441/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o
00:02:24.429 [442/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o
00:02:24.429 [443/740] Linking static target drivers/libtmp_rte_bus_pci.a
00:02:24.429 [444/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.429 [445/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o
00:02:24.429 [446/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o
00:02:24.429 [447/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.702 [448/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o
00:02:24.702 [449/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.702 [450/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o
00:02:24.702 [451/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.702 [452/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o
00:02:24.702 [453/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o
00:02:24.702 [454/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o
00:02:24.702 [455/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o
00:02:24.702 [456/740] Linking static target lib/librte_fib.a
00:02:24.702 [457/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command
00:02:24.702 [458/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o
00:02:24.702 [459/740] Linking static target drivers/librte_bus_vdev.a
00:02:24.702 [460/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.702 [461/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.702 [462/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o
00:02:24.702 [463/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o
00:02:24.702 [464/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.702 [465/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o
00:02:24.702 [466/740] Linking static target lib/librte_pdump.a
00:02:24.702 [467/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o
00:02:24.702 [468/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.702 [469/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o
00:02:24.979 [470/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o
00:02:24.979 [471/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o
00:02:24.979 [472/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o
00:02:24.979 [473/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.979 [474/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output)
00:02:24.979 [475/740] Generating drivers/rte_bus_pci.pmd.c with a custom command
00:02:24.979 [476/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o
00:02:24.979 [477/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o
00:02:24.979 [478/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o
00:02:24.979 [479/740] Linking static target drivers/librte_bus_pci.a
00:02:24.979 [480/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o
00:02:24.979 [481/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o
00:02:24.979 [482/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o
00:02:24.979 [483/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o
00:02:24.979 [484/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o
00:02:24.979 [485/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o
00:02:24.979 [486/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o
00:02:24.979 [487/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o
00:02:24.979 [488/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o
00:02:24.979 [489/740] Linking static target lib/librte_table.a
00:02:25.240 [490/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o
00:02:25.241 [491/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:25.241 [492/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o
00:02:25.241 [493/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o
00:02:25.241 [494/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o
00:02:25.241 [495/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o
00:02:25.241 [496/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o
00:02:25.241 [497/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o
00:02:25.241 [498/740] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o
00:02:25.241 [499/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o
00:02:25.241 [500/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output)
00:02:25.241 [501/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o
00:02:25.241 [502/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o
00:02:25.241 [503/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o
00:02:25.241 [504/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols
00:02:25.241 [505/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols
00:02:25.241 [506/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o
00:02:25.241 [507/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o
00:02:25.241 [508/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o
00:02:25.241 [509/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output)
00:02:25.241 [510/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o
00:02:25.241 [511/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o
00:02:25.241 [512/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o
00:02:25.241 [513/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o
00:02:25.241 [514/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o
00:02:25.241 [515/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o
00:02:25.241 [516/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o
00:02:25.241 [517/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output)
00:02:25.241 [518/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o
00:02:25.241 [519/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o
00:02:25.502 [520/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o
00:02:25.502 [521/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o
00:02:25.502 [522/740] Linking static target lib/librte_cryptodev.a
00:02:25.502 [523/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o
00:02:25.502 [524/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o
00:02:25.502 [525/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o
00:02:25.502 [526/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o
00:02:25.502 [527/740] Linking static target drivers/libtmp_rte_mempool_ring.a
00:02:25.502 [528/740] Linking static target lib/librte_sched.a
00:02:25.502 [529/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o
00:02:25.502 [530/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o
00:02:25.502 [531/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o
00:02:25.502 [532/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output)
00:02:25.502 [533/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o
00:02:25.502 [534/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o
00:02:25.502 [535/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o
00:02:25.502 [536/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o
00:02:25.502 [537/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o
00:02:25.502 [538/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o
00:02:25.502 [539/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o
00:02:25.502 [540/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o
00:02:25.502 [541/740] Linking static target lib/librte_ipsec.a
00:02:25.502 [542/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o
00:02:25.502 [543/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o
00:02:25.502 [544/740] Linking static target lib/librte_ethdev.a
00:02:25.763 [545/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output)
00:02:25.763 [546/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o
00:02:25.763 [547/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command
00:02:25.763 [548/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o
00:02:25.763 [549/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o
00:02:25.763 [550/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o
00:02:25.763 [551/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o
00:02:25.763 [552/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o
00:02:25.763 [553/740] Linking static target drivers/librte_mempool_ring.a
00:02:25.763 [554/740] Linking static target lib/librte_node.a
00:02:25.763 [555/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o
00:02:25.763 [556/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o
00:02:25.763 [557/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o
00:02:25.763 [558/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o
00:02:25.763 [559/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o
00:02:25.763 [560/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o
00:02:25.763 [561/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o
00:02:25.763 [562/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o
00:02:25.763 [563/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o
00:02:25.763 [564/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o
00:02:25.763 [565/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o
00:02:25.763 [566/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o
00:02:25.763 [567/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o
00:02:25.763 [568/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o
00:02:25.763 [569/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o
00:02:25.763 [570/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o
00:02:25.763 [571/740] Linking static target lib/librte_port.a
00:02:25.763 [572/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o
00:02:25.763 [573/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o
00:02:25.763 [574/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o
00:02:25.763 [575/740] Linking static target lib/librte_member.a
00:02:25.763 [576/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o
00:02:25.763 [577/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o
00:02:26.024 [578/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o
00:02:26.024 [579/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o
00:02:26.024 [580/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o
00:02:26.024 [581/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o
00:02:26.024 [582/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o
00:02:26.024 [583/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o
00:02:26.024 [584/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o
00:02:26.024 [585/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o
00:02:26.024 [586/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output)
00:02:26.024 [587/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output)
00:02:26.024 [588/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o
00:02:26.024 [589/740] Linking static target lib/librte_hash.a
00:02:26.024 [590/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output)
00:02:26.024 [591/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o
00:02:26.024 [592/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o
00:02:26.024 [593/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output)
00:02:26.024 [594/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o
00:02:26.024 [595/740] Linking static target lib/librte_eventdev.a
00:02:26.024 [596/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o
00:02:26.024 [597/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o
00:02:26.286 [598/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a
00:02:26.286 [599/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o
00:02:26.286 [600/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o
00:02:26.286 [601/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o
00:02:26.286 [602/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o
00:02:26.286 [603/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o
00:02:26.286 [604/740] Linking static target drivers/net/i40e/base/libi40e_base.a
00:02:26.286 [605/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o
00:02:26.286 [606/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o
00:02:26.286 [607/740] Linking static target lib/librte_acl.a
00:02:26.286 [608/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output)
00:02:26.548 [609/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o
00:02:26.548 [610/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o
00:02:26.548 [611/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o
00:02:26.548 [612/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o
00:02:26.807 [613/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output)
00:02:26.807 [614/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o
00:02:26.807 [615/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output)
00:02:26.807 [616/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o
00:02:27.069 [617/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o
00:02:27.069 [618/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output)
00:02:27.330 [619/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o
00:02:27.590 [620/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o
00:02:27.590 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o
00:02:28.160 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o
00:02:28.160 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a
00:02:28.421 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command
00:02:28.421 [625/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:02:28.421 [626/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:02:28.681 [627/740] Linking static target drivers/librte_net_i40e.a
00:02:29.253 [628/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:29.253 [629/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o
00:02:29.253 [630/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o
00:02:29.253 [631/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o
00:02:29.513 [632/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:29.773 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output)
00:02:35.053 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:35.053 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o
00:02:35.053 [636/740] Linking static target lib/librte_vhost.a
00:02:35.996 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o
00:02:35.996 [638/740] Linking static target lib/librte_pipeline.a
00:02:36.257 [639/740] Linking target app/dpdk-dumpcap
00:02:36.257 [640/740] Linking target app/dpdk-proc-info
00:02:36.257 [641/740] Linking target app/dpdk-test-sad
00:02:36.257 [642/740] Linking target app/dpdk-test-regex
00:02:36.257 [643/740] Linking target app/dpdk-test-acl
00:02:36.257 [644/740] Linking target app/dpdk-test-gpudev
00:02:36.257 [645/740] Linking target app/dpdk-test-fib
00:02:36.257 [646/740] Linking target app/dpdk-test-cmdline
00:02:36.257 [647/740] Linking target app/dpdk-pdump
00:02:36.257 [648/740] Linking target app/dpdk-test-bbdev
00:02:36.257 [649/740] Linking target app/dpdk-test-flow-perf
[650/740] Linking target app/dpdk-test-compress-perf 00:02:36.257 [651/740] Linking target app/dpdk-test-pipeline 00:02:36.257 [652/740] Linking target app/dpdk-test-security-perf 00:02:36.257 [653/740] Linking target app/dpdk-test-crypto-perf 00:02:36.257 [654/740] Linking target app/dpdk-test-eventdev 00:02:36.257 [655/740] Linking target app/dpdk-testpmd 00:02:37.198 [656/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:38.140 [657/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:38.140 [658/740] Linking target lib/librte_eal.so.23.0 00:02:38.140 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:02:38.401 [660/740] Linking target lib/librte_meter.so.23.0 00:02:38.401 [661/740] Linking target lib/librte_ring.so.23.0 00:02:38.401 [662/740] Linking target lib/librte_timer.so.23.0 00:02:38.401 [663/740] Linking target lib/librte_cfgfile.so.23.0 00:02:38.401 [664/740] Linking target lib/librte_pci.so.23.0 00:02:38.401 [665/740] Linking target lib/librte_stack.so.23.0 00:02:38.402 [666/740] Linking target lib/librte_jobstats.so.23.0 00:02:38.402 [667/740] Linking target lib/librte_acl.so.23.0 00:02:38.402 [668/740] Linking target lib/librte_dmadev.so.23.0 00:02:38.402 [669/740] Linking target lib/librte_rawdev.so.23.0 00:02:38.402 [670/740] Linking target lib/librte_graph.so.23.0 00:02:38.402 [671/740] Linking target drivers/librte_bus_vdev.so.23.0 00:02:38.402 [672/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:02:38.402 [673/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:02:38.402 [674/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:02:38.402 [675/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:02:38.402 [676/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:02:38.402 [677/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:02:38.402 [678/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:02:38.402 [679/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:02:38.402 [680/740] Linking target lib/librte_rcu.so.23.0 00:02:38.402 [681/740] Linking target drivers/librte_bus_pci.so.23.0 00:02:38.402 [682/740] Linking target lib/librte_mempool.so.23.0 00:02:38.662 [683/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:02:38.662 [684/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:02:38.662 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:02:38.662 [686/740] Linking target lib/librte_rib.so.23.0 00:02:38.662 [687/740] Linking target drivers/librte_mempool_ring.so.23.0 00:02:38.662 [688/740] Linking target lib/librte_mbuf.so.23.0 00:02:38.923 [689/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:02:38.923 [690/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:02:38.923 [691/740] Linking target lib/librte_fib.so.23.0 00:02:38.923 [692/740] Linking target lib/librte_gpudev.so.23.0 00:02:38.923 [693/740] Linking target lib/librte_cryptodev.so.23.0 00:02:38.923 [694/740] Linking target lib/librte_bbdev.so.23.0 00:02:38.923 [695/740] Linking target lib/librte_reorder.so.23.0 
00:02:38.923 [696/740] Linking target lib/librte_net.so.23.0 00:02:38.923 [697/740] Linking target lib/librte_compressdev.so.23.0 00:02:38.923 [698/740] Linking target lib/librte_distributor.so.23.0 00:02:38.923 [699/740] Linking target lib/librte_regexdev.so.23.0 00:02:38.923 [700/740] Linking target lib/librte_sched.so.23.0 00:02:38.923 [701/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols 00:02:38.923 [702/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols 00:02:38.923 [703/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols 00:02:39.184 [704/740] Linking target lib/librte_security.so.23.0 00:02:39.184 [705/740] Linking target lib/librte_hash.so.23.0 00:02:39.184 [706/740] Linking target lib/librte_cmdline.so.23.0 00:02:39.184 [707/740] Linking target lib/librte_ethdev.so.23.0 00:02:39.184 [708/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols 00:02:39.184 [709/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols 00:02:39.184 [710/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols 00:02:39.184 [711/740] Linking target lib/librte_metrics.so.23.0 00:02:39.184 [712/740] Linking target lib/librte_gso.so.23.0 00:02:39.184 [713/740] Linking target lib/librte_pcapng.so.23.0 00:02:39.184 [714/740] Linking target lib/librte_gro.so.23.0 00:02:39.184 [715/740] Linking target lib/librte_efd.so.23.0 00:02:39.184 [716/740] Linking target lib/librte_lpm.so.23.0 00:02:39.184 [717/740] Linking target lib/librte_bpf.so.23.0 00:02:39.184 [718/740] Linking target lib/librte_member.so.23.0 00:02:39.184 [719/740] Linking target lib/librte_ip_frag.so.23.0 00:02:39.184 [720/740] Linking target lib/librte_power.so.23.0 00:02:39.184 [721/740] Linking target lib/librte_ipsec.so.23.0 00:02:39.184 [722/740] Linking target lib/librte_eventdev.so.23.0 00:02:39.184 [723/740] Linking target lib/librte_vhost.so.23.0 00:02:39.184 [724/740] Linking target drivers/librte_net_i40e.so.23.0 00:02:39.446 [725/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols 00:02:39.446 [726/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols 00:02:39.446 [727/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols 00:02:39.446 [728/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols 00:02:39.446 [729/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols 00:02:39.446 [730/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols 00:02:39.446 [731/740] Linking target lib/librte_latencystats.so.23.0 00:02:39.446 [732/740] Linking target lib/librte_bitratestats.so.23.0 00:02:39.446 [733/740] Linking target lib/librte_node.so.23.0 00:02:39.446 [734/740] Linking target lib/librte_pdump.so.23.0 00:02:39.446 [735/740] Linking target lib/librte_port.so.23.0 00:02:39.706 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols 00:02:39.706 [737/740] Linking target lib/librte_table.so.23.0 00:02:39.706 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols 00:02:41.105 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.364 [740/740] Linking target lib/librte_pipeline.so.23.0 00:02:41.364 12:44:44 build_native_dpdk -- 
common/autobuild_common.sh@201 -- $ uname -s 00:02:41.364 12:44:44 build_native_dpdk -- common/autobuild_common.sh@201 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:41.364 12:44:44 build_native_dpdk -- common/autobuild_common.sh@214 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:41.364 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:41.364 [0/1] Installing files. 00:02:41.626 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:41.626 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:41.626 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:41.626 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:41.626 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:41.626 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:41.626 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:41.626 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:41.626 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:41.626 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:41.626 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:41.627 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 
00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:41.627 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:41.628 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:41.628 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:41.629 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:41.629 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:41.629 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.641 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.642 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
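The example sources above (ipsec-secgw, l2fwd-keepalive, ip_pipeline, and the rest) land under the build prefix's share/dpdk/examples tree, and each ships a standalone Makefile that resolves compiler and linker flags through pkg-config (libdpdk). A minimal rebuild sketch, assuming the generated .pc files sit under build/lib/pkgconfig (the exact pkgconfig subdirectory depends on the configured libdir, so treat the path as an assumption):
  # Rebuild one installed example against this DPDK build (illustrative only)
  export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
  make -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd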
00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.642 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:41.643 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:41.643 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:41.643 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:41.643 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.643 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.643 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.643 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.643 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.643 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.643 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.643 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.643 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing 
lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_gso.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.644 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
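Each library in this run is installed twice: a static archive (librte_*.a) and a versioned shared object (librte_*.so.23.0), the 23.0 suffix being the ABI version the v22.11 tree ships, as the log itself shows. One way to confirm the SONAME embedded in a freshly installed object, as a hedged check (readelf from binutils is assumed to be present on the build host):
  # Inspect the SONAME recorded in one of the installed shared objects
  readelf -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23.0 | grep SONAME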
00:02:41.911 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:41.911 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:41.911 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:41.911 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:41.911 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:41.911 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 
Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 
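With the test applications now under build/bin, an interactive smoke test of the install is possible once the usual EAL prerequisites are met. A minimal sketch, assuming hugepages are configured and at least one NIC is bound to a DPDK-capable driver; the core list and memory-channel count are illustrative values, not anything this job runs:
  # Launch testpmd from the freshly installed binaries (illustrative values)
  /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin/dpdk-testpmd -l 0-1 -n 4 -- -i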
00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.911 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
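The header layout above follows DPDK's usual split: the generic templates go to build/include/generic while the x86 implementations (which include them) land directly in build/include, the include root an external application compiles against. A hedged one-liner, again assuming the pkg-config metadata is under build/lib/pkgconfig; app.c is a hypothetical source file, not part of this job:
  # Compile a hypothetical app.c against the installed headers and libraries
  export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
  cc app.c -o app $(pkg-config --cflags --libs libdpdk)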
00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.912 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:41.913 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:41.913 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:41.913 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:02:41.913 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:41.913 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:02:41.913 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:41.913 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:02:41.913 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:41.913 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:02:41.913 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:41.913 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:02:41.913 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:41.913 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:02:41.913 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:41.913 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:02:41.913 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:41.913 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:02:41.913 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:41.913 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:02:41.913 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:41.913 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:02:41.913 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:41.913 Installing symlink pointing to librte_pci.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:02:41.913 Installing symlink pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:41.913 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:02:41.913 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:41.913 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:02:41.913 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:41.913 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:02:41.913 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:41.913 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:02:41.913 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:41.913 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:02:41.913 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:41.913 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:02:41.913 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:41.913 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:02:41.913 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:41.913 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:02:41.913 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:41.913 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:02:41.913 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:41.913 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:02:41.913 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:41.913 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:02:41.913 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:41.913 Installing symlink pointing 
to librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:02:41.913 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:41.913 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:02:41.913 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:41.913 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:02:41.913 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:41.913 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:02:41.913 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:41.913 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:02:41.913 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:41.913 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:02:41.913 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:41.913 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:02:41.913 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:41.913 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:02:41.913 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:41.913 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:02:41.913 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:41.913 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:02:41.913 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:41.913 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:02:41.913 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:41.913 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:02:41.913 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:41.913 
Installing symlink pointing to librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:02:41.913 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:41.913 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:02:41.913 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:41.913 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:02:41.913 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:41.913 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:02:41.913 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:41.913 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:02:41.913 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:41.913 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:02:41.913 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:41.913 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:02:41.913 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:41.913 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:02:41.913 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:41.913 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:02:41.913 Installing symlink pointing to librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:41.913 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:02:41.913 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:41.913 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:02:41.913 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:02:41.913 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:02:41.913 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:02:41.914 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:02:41.914 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:02:41.914 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:02:41.914 './librte_mempool_ring.so.23' -> 
'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:02:41.914 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:02:41.914 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:02:41.914 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:02:41.914 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:02:41.914 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:02:41.914 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:41.914 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:02:41.914 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:41.914 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:02:41.914 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:41.914 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:02:41.914 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:41.914 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:02:41.914 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:41.914 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:02:41.914 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:41.914 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:02:41.914 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:41.914 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:02:41.914 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:41.914 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:02:41.914 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:02:41.914 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:02:41.914 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:02:41.914 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23
00:02:41.914 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so
00:02:41.914 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23
00:02:41.914 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so
00:02:41.914 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0'
00:02:41.914 12:44:45 build_native_dpdk -- common/autobuild_common.sh@220 -- $ cat
00:02:41.914 12:44:45 build_native_dpdk -- common/autobuild_common.sh@225 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:41.914
00:02:41.914 real 0m27.583s
00:02:41.914 user 6m35.454s
00:02:41.914 sys 2m17.013s
00:02:41.914 12:44:45 build_native_dpdk -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:02:41.914 12:44:45 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x
00:02:41.914 ************************************
00:02:41.914 END TEST build_native_dpdk
00:02:41.914 ************************************
00:02:42.173 12:44:45 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:02:42.173 12:44:45 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:02:42.173 12:44:45 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:02:42.173 12:44:45 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:02:42.173 12:44:45 -- common/autobuild_common.sh@445 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:02:42.173 12:44:45 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']'
00:02:42.173 12:44:45 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:02:42.173 12:44:45 -- common/autotest_common.sh@10 -- $ set +x
00:02:42.173 ************************************
00:02:42.173 START TEST autobuild_llvm_precompile
00:02:42.173 ************************************
00:02:42.173 12:44:45 autobuild_llvm_precompile -- common/autotest_common.sh@1129 -- $ _llvm_precompile
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39)
00:02:42.433 Target: x86_64-redhat-linux-gnu
00:02:42.433 Thread model: posix
00:02:42.433 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
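The shell trace above is where the autotest resolves clang 17's libclang_rt.fuzzer_no_main.a, the libFuzzer runtime built without a main(), which the --with-fuzzer flag below hands to SPDK's configure. As a hedged sketch of how such an archive is typically consumed (this harness is illustrative, not SPDK's actual fuzz code; fuzz_one and the toy bug are invented), a program linking fuzzer_no_main supplies its own main() and enters the fuzzing loop through the documented LLVMFuzzerRunDriver entry point:

    #include <stdint.h>
    #include <stddef.h>

    /* Exported by libclang_rt.fuzzer_no_main.a (documented libFuzzer API). */
    int LLVMFuzzerRunDriver(int *argc, char ***argv,
                            int (*user_cb)(const uint8_t *data, size_t size));

    /* Called once per generated input; crash or abort to report a bug. */
    static int fuzz_one(const uint8_t *data, size_t size)
    {
        if (size >= 4 && data[0] == 'F' && data[1] == 'U' &&
            data[2] == 'Z' && data[3] == 'Z')
            __builtin_trap();   /* toy bug so the fuzzer has something to find */
        return 0;
    }

    int main(int argc, char **argv)
    {
        return LLVMFuzzerRunDriver(&argc, &argv, fuzz_one);
    }

    /* Possible build, using the archive path resolved above; libFuzzer is C++,
     * hence -lstdc++, and -fsanitize=fuzzer-no-link adds the coverage
     * instrumentation the fuzzer steers by:
     *   clang-17 -g -fsanitize=fuzzer-no-link harness.c \
     *     /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a \
     *     -lstdc++ -o harness
     */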
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]]
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a'
00:02:42.433 12:44:45 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:42.693 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:42.954 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:42.954 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:42.954 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:43.896 Using 'verbs' RDMA provider
00:03:00.221 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:03:15.125 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:03:15.125 Creating mk/config.mk...done.
00:03:15.125 Creating mk/cc.flags.mk...done.
00:03:15.125 Type 'make' to build.
00:03:15.125
00:03:15.125 real 0m32.438s
00:03:15.125 user 0m13.246s
00:03:15.125 sys 0m18.556s
00:03:15.125 12:45:17 autobuild_llvm_precompile -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:03:15.125 12:45:17 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x
00:03:15.125 ************************************
00:03:15.125 END TEST autobuild_llvm_precompile
00:03:15.125 ************************************
00:03:15.125 12:45:17 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:03:15.125 12:45:17 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:03:15.125 12:45:17 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:03:15.125 12:45:17 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:03:15.125 12:45:17 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:03:15.125 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
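Both configure passes above locate DPDK through the libdpdk.pc metadata installed earlier into dpdk/build/lib/pkgconfig. Outside this pipeline, the same pkg-config files are what an out-of-tree application would compile against; a minimal sketch (hello_dpdk.c and the build line are illustrative, not taken from this log):

    /* Possible build, pointing pkg-config at the staged install used above:
     *   PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig \
     *     cc hello_dpdk.c $(pkg-config --cflags --libs libdpdk) -o hello_dpdk
     */
    #include <stdio.h>
    #include <rte_eal.h>
    #include <rte_version.h>

    int main(int argc, char **argv)
    {
        /* A negative return means the EAL could not be brought up. */
        if (rte_eal_init(argc, argv) < 0) {
            fprintf(stderr, "rte_eal_init failed\n");
            return 1;
        }
        printf("built against %s\n", rte_version());  /* e.g. a DPDK 22.11.x string */
        rte_eal_cleanup();
        return 0;
    }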
00:03:15.125 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:03:15.125 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:03:15.125 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:03:15.386 Using 'verbs' RDMA provider
00:03:28.572 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:03:40.807 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:03:40.807 Creating mk/config.mk...done.
00:03:40.807 Creating mk/cc.flags.mk...done.
00:03:40.807 Type 'make' to build.
00:03:40.807 12:45:43 -- spdk/autobuild.sh@70 -- $ run_test make make -j112
00:03:40.807 12:45:43 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']'
00:03:40.807 12:45:43 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:03:40.807 12:45:43 -- common/autotest_common.sh@10 -- $ set +x
00:03:40.807 ************************************
00:03:40.807 START TEST make
00:03:40.807 ************************************
00:03:40.807 12:45:43 make -- common/autotest_common.sh@1129 -- $ make -j112
00:03:41.067 make[1]: Nothing to be done for 'all'.
00:03:42.976 The Meson build system
00:03:42.976 Version: 1.5.0
00:03:42.976 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:03:42.976 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:42.976 Build type: native build
00:03:42.976 Project name: libvfio-user
00:03:42.976 Project version: 0.0.1
00:03:42.976 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)")
00:03:42.976 C linker for the host machine: clang-17 ld.bfd 2.40-14
00:03:42.976 Host machine cpu family: x86_64
00:03:42.976 Host machine cpu: x86_64
00:03:42.976 Run-time dependency threads found: YES
00:03:42.976 Library dl found: YES
00:03:42.976 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:03:42.976 Run-time dependency json-c found: YES 0.17
00:03:42.976 Run-time dependency cmocka found: YES 1.1.7
00:03:42.976 Program pytest-3 found: NO
00:03:42.976 Program flake8 found: NO
00:03:42.976 Program misspell-fixer found: NO
00:03:42.976 Program restructuredtext-lint found: NO
00:03:42.976 Program valgrind found: YES (/usr/bin/valgrind)
00:03:42.976 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:03:42.976 Compiler for C supports arguments -Wmissing-declarations: YES
00:03:42.976 Compiler for C supports arguments -Wwrite-strings: YES
00:03:42.977 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:03:42.977 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:03:42.977 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:03:42.977 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:03:42.977 Build targets in project: 8 00:03:42.977 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:42.977 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:42.977 00:03:42.977 libvfio-user 0.0.1 00:03:42.977 00:03:42.977 User defined options 00:03:42.977 buildtype : debug 00:03:42.977 default_library: static 00:03:42.977 libdir : /usr/local/lib 00:03:42.977 00:03:42.977 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:43.236 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:43.236 [1/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:43.236 [2/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:43.236 [3/36] Compiling C object samples/null.p/null.c.o 00:03:43.236 [4/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:43.236 [5/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:43.236 [6/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:43.236 [7/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:43.236 [8/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:43.236 [9/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:43.236 [10/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:43.236 [11/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:43.236 [12/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:43.236 [13/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:43.236 [14/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:43.236 [15/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:43.236 [16/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:43.236 [17/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:43.236 [18/36] Compiling C object samples/server.p/server.c.o 00:03:43.236 [19/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:43.236 [20/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:43.236 [21/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:43.236 [22/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:43.236 [23/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:43.236 [24/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:43.236 [25/36] Compiling C object samples/client.p/client.c.o 00:03:43.236 [26/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:43.236 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:43.236 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:43.236 [29/36] Linking target samples/client 00:03:43.237 [30/36] Linking static target lib/libvfio-user.a 00:03:43.237 [31/36] Linking target test/unit_tests 00:03:43.504 [32/36] Linking target samples/gpio-pci-idio-16 00:03:43.505 [33/36] Linking target samples/lspci 00:03:43.505 [34/36] Linking target samples/shadow_ioeventfd_server 00:03:43.505 [35/36] Linking target samples/null 00:03:43.505 [36/36] Linking target samples/server 00:03:43.505 INFO: autodetecting backend as ninja 00:03:43.505 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:43.505 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:43.766 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:43.766 ninja: no work to do. 00:03:56.002 CC lib/ut/ut.o 00:03:56.002 CC lib/log/log.o 00:03:56.002 CC lib/ut_mock/mock.o 00:03:56.002 CC lib/log/log_flags.o 00:03:56.262 CC lib/log/log_deprecated.o 00:03:56.262 LIB libspdk_ut.a 00:03:56.262 LIB libspdk_ut_mock.a 00:03:56.262 LIB libspdk_log.a 00:03:56.523 CC lib/ioat/ioat.o 00:03:56.523 CXX lib/trace_parser/trace.o 00:03:56.523 CC lib/util/base64.o 00:03:56.523 CC lib/util/bit_array.o 00:03:56.523 CC lib/util/crc16.o 00:03:56.523 CC lib/util/cpuset.o 00:03:56.523 CC lib/util/crc32.o 00:03:56.523 CC lib/util/crc32c.o 00:03:56.523 CC lib/dma/dma.o 00:03:56.523 CC lib/util/crc32_ieee.o 00:03:56.523 CC lib/util/crc64.o 00:03:56.523 CC lib/util/dif.o 00:03:56.523 CC lib/util/fd.o 00:03:56.523 CC lib/util/fd_group.o 00:03:56.523 CC lib/util/file.o 00:03:56.523 CC lib/util/hexlify.o 00:03:56.523 CC lib/util/iov.o 00:03:56.523 CC lib/util/math.o 00:03:56.523 CC lib/util/net.o 00:03:56.523 CC lib/util/pipe.o 00:03:56.523 CC lib/util/strerror_tls.o 00:03:56.523 CC lib/util/string.o 00:03:56.523 CC lib/util/uuid.o 00:03:56.523 CC lib/util/xor.o 00:03:56.523 CC lib/util/md5.o 00:03:56.523 CC lib/util/zipf.o 00:03:56.783 CC lib/vfio_user/host/vfio_user.o 00:03:56.783 CC lib/vfio_user/host/vfio_user_pci.o 00:03:56.783 LIB libspdk_dma.a 00:03:56.783 LIB libspdk_ioat.a 00:03:56.783 LIB libspdk_vfio_user.a 00:03:57.043 LIB libspdk_util.a 00:03:57.303 CC lib/env_dpdk/env.o 00:03:57.303 CC lib/env_dpdk/memory.o 00:03:57.303 CC lib/conf/conf.o 00:03:57.303 CC lib/env_dpdk/pci.o 00:03:57.303 CC lib/env_dpdk/pci_ioat.o 00:03:57.303 CC lib/env_dpdk/init.o 00:03:57.303 CC lib/env_dpdk/threads.o 00:03:57.303 CC lib/env_dpdk/pci_virtio.o 00:03:57.303 LIB libspdk_trace_parser.a 00:03:57.303 CC lib/env_dpdk/pci_idxd.o 00:03:57.303 CC lib/env_dpdk/pci_vmd.o 00:03:57.303 CC lib/env_dpdk/pci_event.o 00:03:57.303 CC lib/env_dpdk/sigbus_handler.o 00:03:57.303 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:57.303 CC lib/idxd/idxd.o 00:03:57.303 CC lib/env_dpdk/pci_dpdk.o 00:03:57.303 CC lib/idxd/idxd_user.o 00:03:57.303 CC lib/idxd/idxd_kernel.o 00:03:57.303 CC lib/vmd/vmd.o 00:03:57.303 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:57.303 CC lib/vmd/led.o 00:03:57.303 CC lib/json/json_util.o 00:03:57.303 CC lib/json/json_parse.o 00:03:57.303 CC lib/json/json_write.o 00:03:57.303 CC lib/rdma_utils/rdma_utils.o 00:03:57.303 LIB libspdk_conf.a 00:03:57.564 LIB libspdk_rdma_utils.a 00:03:57.564 LIB libspdk_json.a 00:03:57.564 LIB libspdk_idxd.a 00:03:57.564 LIB libspdk_vmd.a 00:03:57.824 CC lib/rdma_provider/common.o 00:03:57.824 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:57.824 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:57.824 CC lib/jsonrpc/jsonrpc_server.o 00:03:57.824 CC lib/jsonrpc/jsonrpc_client.o 00:03:57.824 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:57.824 LIB libspdk_rdma_provider.a 00:03:57.824 LIB libspdk_jsonrpc.a 00:03:58.085 LIB libspdk_env_dpdk.a 00:03:58.346 CC lib/rpc/rpc.o 00:03:58.346 LIB libspdk_rpc.a 00:03:58.606 CC lib/notify/notify.o 00:03:58.606 CC lib/notify/notify_rpc.o 00:03:58.606 CC lib/keyring/keyring.o 00:03:58.606 CC lib/keyring/keyring_rpc.o 00:03:58.606 CC lib/trace/trace.o 00:03:58.606 CC lib/trace/trace_flags.o 00:03:58.606 CC lib/trace/trace_rpc.o 00:03:58.867 LIB libspdk_notify.a 00:03:58.867 LIB libspdk_keyring.a 00:03:58.867 LIB 
libspdk_trace.a 00:03:59.128 CC lib/thread/thread.o 00:03:59.128 CC lib/thread/iobuf.o 00:03:59.128 CC lib/sock/sock.o 00:03:59.128 CC lib/sock/sock_rpc.o 00:03:59.389 LIB libspdk_sock.a 00:03:59.649 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:59.649 CC lib/nvme/nvme_ctrlr.o 00:03:59.649 CC lib/nvme/nvme_fabric.o 00:03:59.649 CC lib/nvme/nvme_ns_cmd.o 00:03:59.649 CC lib/nvme/nvme_ns.o 00:03:59.649 CC lib/nvme/nvme_pcie_common.o 00:03:59.649 CC lib/nvme/nvme_pcie.o 00:03:59.649 CC lib/nvme/nvme_qpair.o 00:03:59.649 CC lib/nvme/nvme.o 00:03:59.649 CC lib/nvme/nvme_quirks.o 00:03:59.649 CC lib/nvme/nvme_transport.o 00:03:59.649 CC lib/nvme/nvme_discovery.o 00:03:59.649 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:59.649 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:59.649 CC lib/nvme/nvme_tcp.o 00:03:59.649 CC lib/nvme/nvme_opal.o 00:03:59.649 CC lib/nvme/nvme_io_msg.o 00:03:59.649 CC lib/nvme/nvme_poll_group.o 00:03:59.649 CC lib/nvme/nvme_zns.o 00:03:59.649 CC lib/nvme/nvme_auth.o 00:03:59.649 CC lib/nvme/nvme_stubs.o 00:03:59.649 CC lib/nvme/nvme_cuse.o 00:03:59.649 CC lib/nvme/nvme_vfio_user.o 00:03:59.649 CC lib/nvme/nvme_rdma.o 00:03:59.908 LIB libspdk_thread.a 00:04:00.167 CC lib/blob/blobstore.o 00:04:00.167 CC lib/blob/request.o 00:04:00.167 CC lib/blob/zeroes.o 00:04:00.167 CC lib/blob/blob_bs_dev.o 00:04:00.167 CC lib/fsdev/fsdev.o 00:04:00.167 CC lib/fsdev/fsdev_rpc.o 00:04:00.167 CC lib/fsdev/fsdev_io.o 00:04:00.167 CC lib/init/json_config.o 00:04:00.167 CC lib/init/subsystem_rpc.o 00:04:00.167 CC lib/init/subsystem.o 00:04:00.167 CC lib/init/rpc.o 00:04:00.167 CC lib/vfu_tgt/tgt_endpoint.o 00:04:00.167 CC lib/vfu_tgt/tgt_rpc.o 00:04:00.167 CC lib/accel/accel.o 00:04:00.167 CC lib/accel/accel_rpc.o 00:04:00.167 CC lib/accel/accel_sw.o 00:04:00.167 CC lib/virtio/virtio.o 00:04:00.167 CC lib/virtio/virtio_vhost_user.o 00:04:00.167 CC lib/virtio/virtio_vfio_user.o 00:04:00.167 CC lib/virtio/virtio_pci.o 00:04:00.427 LIB libspdk_init.a 00:04:00.428 LIB libspdk_vfu_tgt.a 00:04:00.428 LIB libspdk_virtio.a 00:04:00.689 LIB libspdk_fsdev.a 00:04:00.689 CC lib/event/app.o 00:04:00.689 CC lib/event/reactor.o 00:04:00.689 CC lib/event/log_rpc.o 00:04:00.689 CC lib/event/app_rpc.o 00:04:00.689 CC lib/event/scheduler_static.o 00:04:00.949 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:04:00.949 LIB libspdk_event.a 00:04:00.949 LIB libspdk_accel.a 00:04:00.949 LIB libspdk_nvme.a 00:04:01.209 LIB libspdk_fuse_dispatcher.a 00:04:01.209 CC lib/bdev/bdev.o 00:04:01.209 CC lib/bdev/bdev_rpc.o 00:04:01.209 CC lib/bdev/bdev_zone.o 00:04:01.209 CC lib/bdev/part.o 00:04:01.209 CC lib/bdev/scsi_nvme.o 00:04:02.152 LIB libspdk_blob.a 00:04:02.152 CC lib/blobfs/blobfs.o 00:04:02.152 CC lib/lvol/lvol.o 00:04:02.152 CC lib/blobfs/tree.o 00:04:02.722 LIB libspdk_lvol.a 00:04:02.722 LIB libspdk_blobfs.a 00:04:02.983 LIB libspdk_bdev.a 00:04:03.242 CC lib/scsi/dev.o 00:04:03.242 CC lib/ftl/ftl_core.o 00:04:03.242 CC lib/ftl/ftl_init.o 00:04:03.242 CC lib/scsi/lun.o 00:04:03.242 CC lib/ftl/ftl_layout.o 00:04:03.242 CC lib/ftl/ftl_debug.o 00:04:03.242 CC lib/scsi/port.o 00:04:03.242 CC lib/ftl/ftl_io.o 00:04:03.242 CC lib/ftl/ftl_sb.o 00:04:03.242 CC lib/scsi/scsi.o 00:04:03.242 CC lib/scsi/scsi_bdev.o 00:04:03.242 CC lib/ftl/ftl_l2p.o 00:04:03.242 CC lib/ftl/ftl_l2p_flat.o 00:04:03.242 CC lib/scsi/scsi_pr.o 00:04:03.242 CC lib/ftl/ftl_nv_cache.o 00:04:03.242 CC lib/nvmf/ctrlr.o 00:04:03.242 CC lib/nvmf/ctrlr_discovery.o 00:04:03.242 CC lib/ftl/ftl_band.o 00:04:03.242 CC lib/scsi/scsi_rpc.o 00:04:03.242 CC lib/nvmf/subsystem.o 
00:04:03.242 CC lib/nvmf/ctrlr_bdev.o 00:04:03.242 CC lib/scsi/task.o 00:04:03.242 CC lib/nbd/nbd.o 00:04:03.242 CC lib/ftl/ftl_band_ops.o 00:04:03.242 CC lib/ftl/ftl_writer.o 00:04:03.242 CC lib/nbd/nbd_rpc.o 00:04:03.242 CC lib/ftl/ftl_rq.o 00:04:03.242 CC lib/nvmf/nvmf.o 00:04:03.242 CC lib/ublk/ublk_rpc.o 00:04:03.242 CC lib/nvmf/transport.o 00:04:03.242 CC lib/nvmf/nvmf_rpc.o 00:04:03.242 CC lib/ftl/ftl_reloc.o 00:04:03.242 CC lib/ublk/ublk.o 00:04:03.242 CC lib/nvmf/tcp.o 00:04:03.242 CC lib/ftl/ftl_l2p_cache.o 00:04:03.242 CC lib/ftl/ftl_p2l.o 00:04:03.242 CC lib/nvmf/stubs.o 00:04:03.242 CC lib/nvmf/mdns_server.o 00:04:03.242 CC lib/ftl/ftl_p2l_log.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt.o 00:04:03.242 CC lib/nvmf/vfio_user.o 00:04:03.242 CC lib/nvmf/rdma.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:04:03.242 CC lib/nvmf/auth.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_startup.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_md.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_misc.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_band.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:04:03.242 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:04:03.242 CC lib/ftl/utils/ftl_md.o 00:04:03.242 CC lib/ftl/utils/ftl_conf.o 00:04:03.242 CC lib/ftl/utils/ftl_bitmap.o 00:04:03.242 CC lib/ftl/utils/ftl_mempool.o 00:04:03.242 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:04:03.242 CC lib/ftl/utils/ftl_property.o 00:04:03.242 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:04:03.501 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:04:03.501 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:04:03.501 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:04:03.501 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:04:03.501 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:04:03.501 CC lib/ftl/upgrade/ftl_sb_v3.o 00:04:03.501 CC lib/ftl/upgrade/ftl_sb_v5.o 00:04:03.501 CC lib/ftl/nvc/ftl_nvc_dev.o 00:04:03.501 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:04:03.501 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:04:03.501 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:04:03.501 CC lib/ftl/base/ftl_base_dev.o 00:04:03.501 CC lib/ftl/base/ftl_base_bdev.o 00:04:03.501 CC lib/ftl/ftl_trace.o 00:04:03.759 LIB libspdk_nbd.a 00:04:03.759 LIB libspdk_scsi.a 00:04:03.759 LIB libspdk_ublk.a 00:04:04.018 LIB libspdk_ftl.a 00:04:04.018 CC lib/iscsi/conn.o 00:04:04.018 CC lib/iscsi/init_grp.o 00:04:04.018 CC lib/iscsi/iscsi.o 00:04:04.018 CC lib/iscsi/param.o 00:04:04.018 CC lib/iscsi/portal_grp.o 00:04:04.018 CC lib/iscsi/tgt_node.o 00:04:04.018 CC lib/iscsi/iscsi_subsystem.o 00:04:04.018 CC lib/iscsi/iscsi_rpc.o 00:04:04.018 CC lib/iscsi/task.o 00:04:04.018 CC lib/vhost/vhost.o 00:04:04.018 CC lib/vhost/vhost_scsi.o 00:04:04.018 CC lib/vhost/vhost_rpc.o 00:04:04.018 CC lib/vhost/vhost_blk.o 00:04:04.018 CC lib/vhost/rte_vhost_user.o 00:04:04.589 LIB libspdk_nvmf.a 00:04:04.589 LIB libspdk_vhost.a 00:04:04.850 LIB libspdk_iscsi.a 00:04:05.110 CC module/env_dpdk/env_dpdk_rpc.o 00:04:05.369 CC module/vfu_device/vfu_virtio.o 00:04:05.369 CC module/vfu_device/vfu_virtio_blk.o 00:04:05.369 CC module/vfu_device/vfu_virtio_scsi.o 00:04:05.369 CC module/vfu_device/vfu_virtio_rpc.o 00:04:05.369 CC module/vfu_device/vfu_virtio_fs.o 00:04:05.369 LIB libspdk_env_dpdk_rpc.a 00:04:05.369 CC module/keyring/file/keyring.o 00:04:05.369 CC module/keyring/file/keyring_rpc.o 00:04:05.369 CC 
module/fsdev/aio/fsdev_aio.o 00:04:05.369 CC module/fsdev/aio/fsdev_aio_rpc.o 00:04:05.369 CC module/fsdev/aio/linux_aio_mgr.o 00:04:05.369 CC module/scheduler/dynamic/scheduler_dynamic.o 00:04:05.369 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:04:05.369 CC module/accel/iaa/accel_iaa.o 00:04:05.369 CC module/accel/iaa/accel_iaa_rpc.o 00:04:05.369 CC module/accel/ioat/accel_ioat.o 00:04:05.369 CC module/keyring/linux/keyring_rpc.o 00:04:05.369 CC module/keyring/linux/keyring.o 00:04:05.369 CC module/accel/ioat/accel_ioat_rpc.o 00:04:05.369 CC module/scheduler/gscheduler/gscheduler.o 00:04:05.369 CC module/sock/posix/posix.o 00:04:05.369 CC module/accel/dsa/accel_dsa.o 00:04:05.369 CC module/accel/dsa/accel_dsa_rpc.o 00:04:05.369 CC module/accel/error/accel_error.o 00:04:05.369 CC module/accel/error/accel_error_rpc.o 00:04:05.369 CC module/blob/bdev/blob_bdev.o 00:04:05.369 LIB libspdk_keyring_file.a 00:04:05.369 LIB libspdk_keyring_linux.a 00:04:05.369 LIB libspdk_scheduler_dpdk_governor.a 00:04:05.369 LIB libspdk_scheduler_gscheduler.a 00:04:05.630 LIB libspdk_scheduler_dynamic.a 00:04:05.630 LIB libspdk_accel_ioat.a 00:04:05.630 LIB libspdk_accel_iaa.a 00:04:05.630 LIB libspdk_accel_error.a 00:04:05.630 LIB libspdk_blob_bdev.a 00:04:05.630 LIB libspdk_accel_dsa.a 00:04:05.630 LIB libspdk_vfu_device.a 00:04:05.891 LIB libspdk_fsdev_aio.a 00:04:05.891 LIB libspdk_sock_posix.a 00:04:05.891 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:04:05.891 CC module/bdev/lvol/vbdev_lvol.o 00:04:05.891 CC module/blobfs/bdev/blobfs_bdev.o 00:04:05.891 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:04:05.891 CC module/bdev/virtio/bdev_virtio_blk.o 00:04:05.891 CC module/bdev/error/vbdev_error.o 00:04:05.891 CC module/bdev/virtio/bdev_virtio_scsi.o 00:04:05.891 CC module/bdev/error/vbdev_error_rpc.o 00:04:05.891 CC module/bdev/virtio/bdev_virtio_rpc.o 00:04:05.891 CC module/bdev/nvme/bdev_nvme.o 00:04:05.891 CC module/bdev/nvme/bdev_nvme_rpc.o 00:04:05.891 CC module/bdev/delay/vbdev_delay.o 00:04:05.891 CC module/bdev/delay/vbdev_delay_rpc.o 00:04:05.891 CC module/bdev/ftl/bdev_ftl.o 00:04:05.891 CC module/bdev/ftl/bdev_ftl_rpc.o 00:04:05.891 CC module/bdev/nvme/nvme_rpc.o 00:04:06.153 CC module/bdev/nvme/vbdev_opal.o 00:04:06.153 CC module/bdev/nvme/bdev_mdns_client.o 00:04:06.153 CC module/bdev/malloc/bdev_malloc.o 00:04:06.153 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:04:06.153 CC module/bdev/nvme/vbdev_opal_rpc.o 00:04:06.153 CC module/bdev/gpt/vbdev_gpt.o 00:04:06.153 CC module/bdev/iscsi/bdev_iscsi.o 00:04:06.153 CC module/bdev/gpt/gpt.o 00:04:06.153 CC module/bdev/malloc/bdev_malloc_rpc.o 00:04:06.153 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:04:06.153 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:04:06.153 CC module/bdev/split/vbdev_split.o 00:04:06.153 CC module/bdev/passthru/vbdev_passthru.o 00:04:06.153 CC module/bdev/split/vbdev_split_rpc.o 00:04:06.153 CC module/bdev/null/bdev_null.o 00:04:06.153 CC module/bdev/raid/bdev_raid.o 00:04:06.153 CC module/bdev/null/bdev_null_rpc.o 00:04:06.153 CC module/bdev/raid/bdev_raid_rpc.o 00:04:06.153 CC module/bdev/raid/bdev_raid_sb.o 00:04:06.153 CC module/bdev/raid/raid1.o 00:04:06.153 CC module/bdev/raid/raid0.o 00:04:06.153 CC module/bdev/raid/concat.o 00:04:06.153 CC module/bdev/aio/bdev_aio.o 00:04:06.153 CC module/bdev/zone_block/vbdev_zone_block.o 00:04:06.153 CC module/bdev/aio/bdev_aio_rpc.o 00:04:06.153 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:04:06.153 LIB libspdk_blobfs_bdev.a 00:04:06.153 LIB libspdk_bdev_split.a 00:04:06.153 LIB 
libspdk_bdev_error.a 00:04:06.153 LIB libspdk_bdev_gpt.a 00:04:06.153 LIB libspdk_bdev_ftl.a 00:04:06.153 LIB libspdk_bdev_null.a 00:04:06.153 LIB libspdk_bdev_passthru.a 00:04:06.153 LIB libspdk_bdev_iscsi.a 00:04:06.414 LIB libspdk_bdev_zone_block.a 00:04:06.414 LIB libspdk_bdev_aio.a 00:04:06.414 LIB libspdk_bdev_malloc.a 00:04:06.414 LIB libspdk_bdev_delay.a 00:04:06.414 LIB libspdk_bdev_lvol.a 00:04:06.414 LIB libspdk_bdev_virtio.a 00:04:06.676 LIB libspdk_bdev_raid.a 00:04:07.618 LIB libspdk_bdev_nvme.a 00:04:08.189 CC module/event/subsystems/scheduler/scheduler.o 00:04:08.189 CC module/event/subsystems/vmd/vmd.o 00:04:08.189 CC module/event/subsystems/vmd/vmd_rpc.o 00:04:08.189 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:04:08.189 CC module/event/subsystems/iobuf/iobuf.o 00:04:08.189 CC module/event/subsystems/keyring/keyring.o 00:04:08.189 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:04:08.189 CC module/event/subsystems/sock/sock.o 00:04:08.189 CC module/event/subsystems/fsdev/fsdev.o 00:04:08.189 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:04:08.189 LIB libspdk_event_vmd.a 00:04:08.189 LIB libspdk_event_keyring.a 00:04:08.189 LIB libspdk_event_vfu_tgt.a 00:04:08.189 LIB libspdk_event_fsdev.a 00:04:08.189 LIB libspdk_event_iobuf.a 00:04:08.190 LIB libspdk_event_sock.a 00:04:08.190 LIB libspdk_event_vhost_blk.a 00:04:08.190 LIB libspdk_event_scheduler.a 00:04:08.451 CC module/event/subsystems/accel/accel.o 00:04:08.713 LIB libspdk_event_accel.a 00:04:08.974 CC module/event/subsystems/bdev/bdev.o 00:04:09.234 LIB libspdk_event_bdev.a 00:04:09.495 CC module/event/subsystems/ublk/ublk.o 00:04:09.495 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:04:09.495 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:04:09.495 CC module/event/subsystems/scsi/scsi.o 00:04:09.495 CC module/event/subsystems/nbd/nbd.o 00:04:09.495 LIB libspdk_event_ublk.a 00:04:09.495 LIB libspdk_event_nbd.a 00:04:09.495 LIB libspdk_event_scsi.a 00:04:09.495 LIB libspdk_event_nvmf.a 00:04:09.757 CC module/event/subsystems/iscsi/iscsi.o 00:04:09.757 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:04:10.018 LIB libspdk_event_vhost_scsi.a 00:04:10.018 LIB libspdk_event_iscsi.a 00:04:10.278 CXX app/trace/trace.o 00:04:10.278 CC app/spdk_top/spdk_top.o 00:04:10.278 CC app/spdk_nvme_discover/discovery_aer.o 00:04:10.278 CC app/spdk_lspci/spdk_lspci.o 00:04:10.278 CC app/spdk_nvme_identify/identify.o 00:04:10.278 CC app/trace_record/trace_record.o 00:04:10.278 TEST_HEADER include/spdk/accel.h 00:04:10.278 CC app/spdk_nvme_perf/perf.o 00:04:10.278 TEST_HEADER include/spdk/assert.h 00:04:10.278 TEST_HEADER include/spdk/accel_module.h 00:04:10.278 TEST_HEADER include/spdk/barrier.h 00:04:10.278 TEST_HEADER include/spdk/bdev.h 00:04:10.278 TEST_HEADER include/spdk/base64.h 00:04:10.278 TEST_HEADER include/spdk/bdev_module.h 00:04:10.278 TEST_HEADER include/spdk/bdev_zone.h 00:04:10.278 TEST_HEADER include/spdk/bit_array.h 00:04:10.278 TEST_HEADER include/spdk/bit_pool.h 00:04:10.278 TEST_HEADER include/spdk/blobfs_bdev.h 00:04:10.278 TEST_HEADER include/spdk/blob_bdev.h 00:04:10.278 TEST_HEADER include/spdk/blobfs.h 00:04:10.278 TEST_HEADER include/spdk/blob.h 00:04:10.278 CC test/rpc_client/rpc_client_test.o 00:04:10.278 TEST_HEADER include/spdk/crc16.h 00:04:10.278 TEST_HEADER include/spdk/cpuset.h 00:04:10.278 TEST_HEADER include/spdk/conf.h 00:04:10.278 TEST_HEADER include/spdk/config.h 00:04:10.278 TEST_HEADER include/spdk/crc32.h 00:04:10.278 TEST_HEADER include/spdk/endian.h 00:04:10.278 TEST_HEADER 
include/spdk/crc64.h 00:04:10.278 TEST_HEADER include/spdk/dma.h 00:04:10.278 TEST_HEADER include/spdk/dif.h 00:04:10.278 TEST_HEADER include/spdk/env.h 00:04:10.278 TEST_HEADER include/spdk/env_dpdk.h 00:04:10.278 TEST_HEADER include/spdk/event.h 00:04:10.278 TEST_HEADER include/spdk/fd_group.h 00:04:10.278 TEST_HEADER include/spdk/fsdev.h 00:04:10.278 TEST_HEADER include/spdk/fd.h 00:04:10.278 TEST_HEADER include/spdk/fsdev_module.h 00:04:10.278 TEST_HEADER include/spdk/file.h 00:04:10.278 TEST_HEADER include/spdk/ftl.h 00:04:10.278 TEST_HEADER include/spdk/fuse_dispatcher.h 00:04:10.278 TEST_HEADER include/spdk/histogram_data.h 00:04:10.278 CC app/spdk_dd/spdk_dd.o 00:04:10.278 TEST_HEADER include/spdk/gpt_spec.h 00:04:10.278 TEST_HEADER include/spdk/hexlify.h 00:04:10.278 TEST_HEADER include/spdk/idxd.h 00:04:10.278 TEST_HEADER include/spdk/idxd_spec.h 00:04:10.278 TEST_HEADER include/spdk/init.h 00:04:10.278 TEST_HEADER include/spdk/iscsi_spec.h 00:04:10.278 TEST_HEADER include/spdk/ioat.h 00:04:10.278 TEST_HEADER include/spdk/ioat_spec.h 00:04:10.278 TEST_HEADER include/spdk/jsonrpc.h 00:04:10.278 TEST_HEADER include/spdk/json.h 00:04:10.278 CC app/nvmf_tgt/nvmf_main.o 00:04:10.278 TEST_HEADER include/spdk/keyring.h 00:04:10.278 CC app/iscsi_tgt/iscsi_tgt.o 00:04:10.278 TEST_HEADER include/spdk/log.h 00:04:10.278 TEST_HEADER include/spdk/keyring_module.h 00:04:10.278 TEST_HEADER include/spdk/likely.h 00:04:10.278 TEST_HEADER include/spdk/memory.h 00:04:10.278 TEST_HEADER include/spdk/mmio.h 00:04:10.278 TEST_HEADER include/spdk/md5.h 00:04:10.278 TEST_HEADER include/spdk/lvol.h 00:04:10.278 CC examples/interrupt_tgt/interrupt_tgt.o 00:04:10.278 TEST_HEADER include/spdk/nbd.h 00:04:10.278 TEST_HEADER include/spdk/nvme_intel.h 00:04:10.278 TEST_HEADER include/spdk/net.h 00:04:10.278 TEST_HEADER include/spdk/notify.h 00:04:10.278 TEST_HEADER include/spdk/nvme_ocssd.h 00:04:10.278 TEST_HEADER include/spdk/nvme.h 00:04:10.278 TEST_HEADER include/spdk/nvme_spec.h 00:04:10.278 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:04:10.278 TEST_HEADER include/spdk/nvmf_cmd.h 00:04:10.278 TEST_HEADER include/spdk/nvme_zns.h 00:04:10.278 CC app/spdk_tgt/spdk_tgt.o 00:04:10.278 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:04:10.278 TEST_HEADER include/spdk/nvmf_spec.h 00:04:10.278 TEST_HEADER include/spdk/nvmf.h 00:04:10.278 TEST_HEADER include/spdk/opal.h 00:04:10.278 TEST_HEADER include/spdk/nvmf_transport.h 00:04:10.278 TEST_HEADER include/spdk/opal_spec.h 00:04:10.278 TEST_HEADER include/spdk/pci_ids.h 00:04:10.278 TEST_HEADER include/spdk/queue.h 00:04:10.278 TEST_HEADER include/spdk/pipe.h 00:04:10.278 TEST_HEADER include/spdk/rpc.h 00:04:10.278 TEST_HEADER include/spdk/scheduler.h 00:04:10.278 TEST_HEADER include/spdk/reduce.h 00:04:10.278 TEST_HEADER include/spdk/scsi.h 00:04:10.278 TEST_HEADER include/spdk/scsi_spec.h 00:04:10.278 TEST_HEADER include/spdk/sock.h 00:04:10.278 TEST_HEADER include/spdk/stdinc.h 00:04:10.542 TEST_HEADER include/spdk/string.h 00:04:10.542 TEST_HEADER include/spdk/trace.h 00:04:10.542 TEST_HEADER include/spdk/thread.h 00:04:10.542 TEST_HEADER include/spdk/tree.h 00:04:10.542 TEST_HEADER include/spdk/trace_parser.h 00:04:10.542 TEST_HEADER include/spdk/ublk.h 00:04:10.542 TEST_HEADER include/spdk/util.h 00:04:10.542 TEST_HEADER include/spdk/uuid.h 00:04:10.542 TEST_HEADER include/spdk/version.h 00:04:10.542 TEST_HEADER include/spdk/vfio_user_spec.h 00:04:10.542 TEST_HEADER include/spdk/vfio_user_pci.h 00:04:10.542 TEST_HEADER include/spdk/vhost.h 00:04:10.542 
TEST_HEADER include/spdk/vmd.h 00:04:10.542 TEST_HEADER include/spdk/zipf.h 00:04:10.542 TEST_HEADER include/spdk/xor.h 00:04:10.542 CXX test/cpp_headers/accel.o 00:04:10.542 CXX test/cpp_headers/accel_module.o 00:04:10.542 CXX test/cpp_headers/assert.o 00:04:10.542 CXX test/cpp_headers/barrier.o 00:04:10.542 CXX test/cpp_headers/base64.o 00:04:10.542 CXX test/cpp_headers/bdev_module.o 00:04:10.542 CXX test/cpp_headers/bdev.o 00:04:10.542 CXX test/cpp_headers/bdev_zone.o 00:04:10.542 CXX test/cpp_headers/blobfs_bdev.o 00:04:10.542 CXX test/cpp_headers/bit_pool.o 00:04:10.542 CXX test/cpp_headers/bit_array.o 00:04:10.542 CXX test/cpp_headers/blob_bdev.o 00:04:10.542 CXX test/cpp_headers/blobfs.o 00:04:10.542 CXX test/cpp_headers/blob.o 00:04:10.542 CXX test/cpp_headers/config.o 00:04:10.542 CXX test/cpp_headers/conf.o 00:04:10.542 CXX test/cpp_headers/crc16.o 00:04:10.542 CXX test/cpp_headers/crc64.o 00:04:10.542 CXX test/cpp_headers/cpuset.o 00:04:10.542 CXX test/cpp_headers/dif.o 00:04:10.542 CXX test/cpp_headers/crc32.o 00:04:10.542 CXX test/cpp_headers/endian.o 00:04:10.542 CXX test/cpp_headers/env_dpdk.o 00:04:10.542 CXX test/cpp_headers/dma.o 00:04:10.542 CXX test/cpp_headers/event.o 00:04:10.542 CXX test/cpp_headers/env.o 00:04:10.542 CXX test/cpp_headers/fd.o 00:04:10.542 CXX test/cpp_headers/fd_group.o 00:04:10.542 CXX test/cpp_headers/file.o 00:04:10.542 CXX test/cpp_headers/fsdev_module.o 00:04:10.542 CXX test/cpp_headers/fsdev.o 00:04:10.542 CXX test/cpp_headers/ftl.o 00:04:10.542 CXX test/cpp_headers/fuse_dispatcher.o 00:04:10.542 CXX test/cpp_headers/gpt_spec.o 00:04:10.542 CXX test/cpp_headers/hexlify.o 00:04:10.542 CXX test/cpp_headers/histogram_data.o 00:04:10.542 CXX test/cpp_headers/idxd.o 00:04:10.543 CXX test/cpp_headers/init.o 00:04:10.543 CXX test/cpp_headers/idxd_spec.o 00:04:10.543 CXX test/cpp_headers/ioat.o 00:04:10.543 CXX test/cpp_headers/ioat_spec.o 00:04:10.543 CXX test/cpp_headers/iscsi_spec.o 00:04:10.543 CXX test/cpp_headers/json.o 00:04:10.543 CXX test/cpp_headers/keyring.o 00:04:10.543 CXX test/cpp_headers/keyring_module.o 00:04:10.543 CC test/thread/poller_perf/poller_perf.o 00:04:10.543 CXX test/cpp_headers/jsonrpc.o 00:04:10.543 CXX test/cpp_headers/likely.o 00:04:10.543 CXX test/cpp_headers/log.o 00:04:10.543 CXX test/cpp_headers/lvol.o 00:04:10.543 CXX test/cpp_headers/md5.o 00:04:10.543 CXX test/cpp_headers/memory.o 00:04:10.543 CC examples/ioat/perf/perf.o 00:04:10.543 CC examples/util/zipf/zipf.o 00:04:10.543 CXX test/cpp_headers/mmio.o 00:04:10.543 CXX test/cpp_headers/nbd.o 00:04:10.543 CXX test/cpp_headers/net.o 00:04:10.543 CXX test/cpp_headers/nvme.o 00:04:10.543 CXX test/cpp_headers/notify.o 00:04:10.543 CXX test/cpp_headers/nvme_intel.o 00:04:10.543 CXX test/cpp_headers/nvme_ocssd_spec.o 00:04:10.543 CXX test/cpp_headers/nvme_spec.o 00:04:10.543 CXX test/cpp_headers/nvme_ocssd.o 00:04:10.543 CXX test/cpp_headers/nvmf_cmd.o 00:04:10.543 CXX test/cpp_headers/nvme_zns.o 00:04:10.543 CXX test/cpp_headers/nvmf_fc_spec.o 00:04:10.543 CXX test/cpp_headers/nvmf_spec.o 00:04:10.543 CXX test/cpp_headers/nvmf.o 00:04:10.543 CXX test/cpp_headers/opal.o 00:04:10.543 CXX test/cpp_headers/nvmf_transport.o 00:04:10.543 CXX test/cpp_headers/opal_spec.o 00:04:10.543 CC test/app/jsoncat/jsoncat.o 00:04:10.543 CXX test/cpp_headers/pci_ids.o 00:04:10.543 CXX test/cpp_headers/pipe.o 00:04:10.543 CXX test/cpp_headers/queue.o 00:04:10.543 CXX test/cpp_headers/reduce.o 00:04:10.543 CC app/fio/nvme/fio_plugin.o 00:04:10.543 CXX test/cpp_headers/rpc.o 00:04:10.543 
CXX test/cpp_headers/scheduler.o 00:04:10.543 CXX test/cpp_headers/scsi.o 00:04:10.543 CXX test/cpp_headers/scsi_spec.o 00:04:10.543 LINK spdk_lspci 00:04:10.543 CXX test/cpp_headers/sock.o 00:04:10.543 CXX test/cpp_headers/stdinc.o 00:04:10.543 CC test/app/histogram_perf/histogram_perf.o 00:04:10.543 CC test/app/stub/stub.o 00:04:10.543 CXX test/cpp_headers/string.o 00:04:10.543 CC test/thread/lock/spdk_lock.o 00:04:10.543 CC examples/ioat/verify/verify.o 00:04:10.543 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:04:10.543 CC test/env/memory/memory_ut.o 00:04:10.543 CC test/env/vtophys/vtophys.o 00:04:10.543 CC test/env/pci/pci_ut.o 00:04:10.543 CC test/dma/test_dma/test_dma.o 00:04:10.543 LINK spdk_nvme_discover 00:04:10.543 CXX test/cpp_headers/thread.o 00:04:10.543 LINK rpc_client_test 00:04:10.543 CC test/app/bdev_svc/bdev_svc.o 00:04:10.543 CC app/fio/bdev/fio_plugin.o 00:04:10.543 CC test/env/mem_callbacks/mem_callbacks.o 00:04:10.543 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:04:10.543 LINK spdk_trace_record 00:04:10.543 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:04:10.543 LINK nvmf_tgt 00:04:10.543 LINK interrupt_tgt 00:04:10.543 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:04:10.543 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:04:10.543 LINK iscsi_tgt 00:04:10.543 LINK jsoncat 00:04:10.543 CXX test/cpp_headers/trace.o 00:04:10.543 LINK poller_perf 00:04:10.543 CXX test/cpp_headers/trace_parser.o 00:04:10.543 CXX test/cpp_headers/tree.o 00:04:10.543 CXX test/cpp_headers/ublk.o 00:04:10.543 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:04:10.804 CXX test/cpp_headers/util.o 00:04:10.804 CXX test/cpp_headers/uuid.o 00:04:10.804 LINK zipf 00:04:10.804 CXX test/cpp_headers/version.o 00:04:10.804 CXX test/cpp_headers/vfio_user_pci.o 00:04:10.804 CXX test/cpp_headers/vfio_user_spec.o 00:04:10.804 CXX test/cpp_headers/vhost.o 00:04:10.804 CXX test/cpp_headers/vmd.o 00:04:10.804 LINK histogram_perf 00:04:10.804 CXX test/cpp_headers/xor.o 00:04:10.804 CXX test/cpp_headers/zipf.o 00:04:10.804 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:04:10.804 LINK vtophys 00:04:10.804 LINK spdk_tgt 00:04:10.804 LINK stub 00:04:10.804 LINK env_dpdk_post_init 00:04:10.804 LINK ioat_perf 00:04:10.804 LINK verify 00:04:10.804 LINK spdk_trace 00:04:10.804 LINK bdev_svc 00:04:10.804 LINK mem_callbacks 00:04:10.804 LINK spdk_dd 00:04:10.804 LINK pci_ut 00:04:10.804 LINK nvme_fuzz 00:04:11.096 LINK test_dma 00:04:11.096 LINK spdk_nvme_identify 00:04:11.096 LINK llvm_vfio_fuzz 00:04:11.096 LINK vhost_fuzz 00:04:11.096 LINK spdk_nvme_perf 00:04:11.096 LINK spdk_top 00:04:11.096 LINK llvm_nvme_fuzz 00:04:11.096 LINK spdk_bdev 00:04:11.096 LINK spdk_nvme 00:04:11.356 CC app/vhost/vhost.o 00:04:11.356 LINK memory_ut 00:04:11.356 CC examples/vmd/lsvmd/lsvmd.o 00:04:11.356 CC examples/idxd/perf/perf.o 00:04:11.356 CC examples/sock/hello_world/hello_sock.o 00:04:11.356 CC examples/vmd/led/led.o 00:04:11.356 CC examples/thread/thread/thread_ex.o 00:04:11.356 LINK vhost 00:04:11.356 LINK lsvmd 00:04:11.356 LINK led 00:04:11.616 LINK hello_sock 00:04:11.616 LINK spdk_lock 00:04:11.616 LINK idxd_perf 00:04:11.616 LINK thread 00:04:11.877 LINK iscsi_fuzz 00:04:12.137 CC test/event/reactor_perf/reactor_perf.o 00:04:12.137 CC test/event/event_perf/event_perf.o 00:04:12.137 CC examples/nvme/reconnect/reconnect.o 00:04:12.137 CC test/event/reactor/reactor.o 00:04:12.137 CC examples/nvme/arbitration/arbitration.o 00:04:12.137 CC examples/nvme/hello_world/hello_world.o 00:04:12.137 CC 
examples/nvme/cmb_copy/cmb_copy.o 00:04:12.137 CC examples/nvme/nvme_manage/nvme_manage.o 00:04:12.137 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:04:12.137 CC examples/nvme/abort/abort.o 00:04:12.137 CC examples/nvme/hotplug/hotplug.o 00:04:12.137 CC test/event/app_repeat/app_repeat.o 00:04:12.137 CC test/event/scheduler/scheduler.o 00:04:12.395 LINK event_perf 00:04:12.395 LINK reactor_perf 00:04:12.395 LINK reactor 00:04:12.395 LINK app_repeat 00:04:12.395 LINK pmr_persistence 00:04:12.395 LINK cmb_copy 00:04:12.395 LINK hello_world 00:04:12.395 LINK hotplug 00:04:12.395 LINK scheduler 00:04:12.395 LINK reconnect 00:04:12.395 LINK arbitration 00:04:12.395 LINK abort 00:04:12.395 LINK nvme_manage 00:04:12.655 CC test/nvme/connect_stress/connect_stress.o 00:04:12.655 CC test/nvme/simple_copy/simple_copy.o 00:04:12.655 CC test/nvme/aer/aer.o 00:04:12.655 CC test/nvme/e2edp/nvme_dp.o 00:04:12.655 CC test/nvme/reserve/reserve.o 00:04:12.655 CC test/nvme/fused_ordering/fused_ordering.o 00:04:12.655 CC test/nvme/boot_partition/boot_partition.o 00:04:12.655 CC test/nvme/reset/reset.o 00:04:12.655 CC test/nvme/compliance/nvme_compliance.o 00:04:12.655 CC test/nvme/sgl/sgl.o 00:04:12.655 CC test/nvme/err_injection/err_injection.o 00:04:12.655 CC test/nvme/cuse/cuse.o 00:04:12.655 CC test/nvme/startup/startup.o 00:04:12.655 CC test/nvme/overhead/overhead.o 00:04:12.655 CC test/nvme/fdp/fdp.o 00:04:12.655 CC test/nvme/doorbell_aers/doorbell_aers.o 00:04:12.655 CC test/blobfs/mkfs/mkfs.o 00:04:12.655 CC test/accel/dif/dif.o 00:04:12.655 CC test/lvol/esnap/esnap.o 00:04:12.655 LINK connect_stress 00:04:12.655 LINK startup 00:04:12.655 LINK boot_partition 00:04:12.655 LINK reserve 00:04:12.655 LINK err_injection 00:04:12.655 LINK simple_copy 00:04:12.655 LINK fused_ordering 00:04:12.914 LINK doorbell_aers 00:04:12.914 LINK nvme_dp 00:04:12.914 LINK aer 00:04:12.914 LINK sgl 00:04:12.914 LINK reset 00:04:12.914 LINK overhead 00:04:12.914 LINK mkfs 00:04:12.914 LINK fdp 00:04:12.914 LINK nvme_compliance 00:04:13.174 LINK dif 00:04:13.175 CC examples/accel/perf/accel_perf.o 00:04:13.175 CC examples/blob/hello_world/hello_blob.o 00:04:13.175 CC examples/fsdev/hello_world/hello_fsdev.o 00:04:13.175 CC examples/blob/cli/blobcli.o 00:04:13.435 LINK cuse 00:04:13.435 LINK hello_blob 00:04:13.435 LINK hello_fsdev 00:04:13.435 LINK accel_perf 00:04:13.695 LINK blobcli 00:04:14.266 CC examples/bdev/hello_world/hello_bdev.o 00:04:14.266 CC examples/bdev/bdevperf/bdevperf.o 00:04:14.527 LINK hello_bdev 00:04:14.788 CC test/bdev/bdevio/bdevio.o 00:04:14.788 LINK bdevperf 00:04:15.048 LINK bdevio 00:04:15.990 LINK esnap 00:04:16.251 CC examples/nvmf/nvmf/nvmf.o 00:04:16.512 LINK nvmf 00:04:17.894 00:04:17.894 real 0m37.030s 00:04:17.894 user 4m39.995s 00:04:17.894 sys 1m42.789s 00:04:17.894 12:46:20 make -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:04:17.894 12:46:20 make -- common/autotest_common.sh@10 -- $ set +x 00:04:17.894 ************************************ 00:04:17.894 END TEST make 00:04:17.894 ************************************ 00:04:17.894 12:46:21 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:17.894 12:46:21 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:17.894 12:46:21 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:17.894 12:46:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:17.894 12:46:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:04:17.894 
12:46:21 -- pm/common@44 -- $ pid=6242 00:04:17.894 12:46:21 -- pm/common@50 -- $ kill -TERM 6242 00:04:17.894 12:46:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:17.894 12:46:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:17.894 12:46:21 -- pm/common@44 -- $ pid=6244 00:04:17.894 12:46:21 -- pm/common@50 -- $ kill -TERM 6244 00:04:17.894 12:46:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:17.894 12:46:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:17.894 12:46:21 -- pm/common@44 -- $ pid=6246 00:04:17.894 12:46:21 -- pm/common@50 -- $ kill -TERM 6246 00:04:17.894 12:46:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:17.894 12:46:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:17.894 12:46:21 -- pm/common@44 -- $ pid=6269 00:04:17.894 12:46:21 -- pm/common@50 -- $ sudo -E kill -TERM 6269 00:04:17.894 12:46:21 -- spdk/autorun.sh@26 -- $ (( SPDK_TEST_UNITTEST == 1 || SPDK_RUN_FUNCTIONAL_TEST == 1 )) 00:04:17.894 12:46:21 -- spdk/autorun.sh@27 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:04:17.894 12:46:21 -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:04:17.894 12:46:21 -- common/autotest_common.sh@1711 -- # lcov --version 00:04:17.894 12:46:21 -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:04:18.155 12:46:21 -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:04:18.155 12:46:21 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:18.155 12:46:21 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:18.155 12:46:21 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:18.155 12:46:21 -- scripts/common.sh@336 -- # IFS=.-: 00:04:18.155 12:46:21 -- scripts/common.sh@336 -- # read -ra ver1 00:04:18.155 12:46:21 -- scripts/common.sh@337 -- # IFS=.-: 00:04:18.155 12:46:21 -- scripts/common.sh@337 -- # read -ra ver2 00:04:18.155 12:46:21 -- scripts/common.sh@338 -- # local 'op=<' 00:04:18.155 12:46:21 -- scripts/common.sh@340 -- # ver1_l=2 00:04:18.155 12:46:21 -- scripts/common.sh@341 -- # ver2_l=1 00:04:18.155 12:46:21 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:18.155 12:46:21 -- scripts/common.sh@344 -- # case "$op" in 00:04:18.155 12:46:21 -- scripts/common.sh@345 -- # : 1 00:04:18.155 12:46:21 -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:18.155 12:46:21 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:18.155 12:46:21 -- scripts/common.sh@365 -- # decimal 1 00:04:18.155 12:46:21 -- scripts/common.sh@353 -- # local d=1 00:04:18.155 12:46:21 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:18.155 12:46:21 -- scripts/common.sh@355 -- # echo 1 00:04:18.155 12:46:21 -- scripts/common.sh@365 -- # ver1[v]=1 00:04:18.155 12:46:21 -- scripts/common.sh@366 -- # decimal 2 00:04:18.155 12:46:21 -- scripts/common.sh@353 -- # local d=2 00:04:18.155 12:46:21 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:18.155 12:46:21 -- scripts/common.sh@355 -- # echo 2 00:04:18.155 12:46:21 -- scripts/common.sh@366 -- # ver2[v]=2 00:04:18.155 12:46:21 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:18.155 12:46:21 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:18.155 12:46:21 -- scripts/common.sh@368 -- # return 0 00:04:18.155 12:46:21 -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:18.155 12:46:21 -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:04:18.155 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.155 --rc genhtml_branch_coverage=1 00:04:18.155 --rc genhtml_function_coverage=1 00:04:18.155 --rc genhtml_legend=1 00:04:18.155 --rc geninfo_all_blocks=1 00:04:18.155 --rc geninfo_unexecuted_blocks=1 00:04:18.155 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.155 ' 00:04:18.155 12:46:21 -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:04:18.155 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.155 --rc genhtml_branch_coverage=1 00:04:18.155 --rc genhtml_function_coverage=1 00:04:18.155 --rc genhtml_legend=1 00:04:18.155 --rc geninfo_all_blocks=1 00:04:18.155 --rc geninfo_unexecuted_blocks=1 00:04:18.155 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.155 ' 00:04:18.155 12:46:21 -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:04:18.155 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.155 --rc genhtml_branch_coverage=1 00:04:18.155 --rc genhtml_function_coverage=1 00:04:18.155 --rc genhtml_legend=1 00:04:18.155 --rc geninfo_all_blocks=1 00:04:18.155 --rc geninfo_unexecuted_blocks=1 00:04:18.155 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.155 ' 00:04:18.155 12:46:21 -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:04:18.155 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.155 --rc genhtml_branch_coverage=1 00:04:18.155 --rc genhtml_function_coverage=1 00:04:18.155 --rc genhtml_legend=1 00:04:18.155 --rc geninfo_all_blocks=1 00:04:18.155 --rc geninfo_unexecuted_blocks=1 00:04:18.155 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.155 ' 00:04:18.155 12:46:21 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:18.155 12:46:21 -- nvmf/common.sh@7 -- # uname -s 00:04:18.155 12:46:21 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:18.155 12:46:21 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:18.155 12:46:21 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:18.155 12:46:21 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:18.155 12:46:21 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:18.155 12:46:21 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:18.155 12:46:21 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:18.155 12:46:21 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:18.155 12:46:21 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:18.155 12:46:21 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:18.155 12:46:21 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:18.155 12:46:21 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:18.155 12:46:21 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:18.155 12:46:21 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:18.155 12:46:21 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:18.155 12:46:21 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:18.155 12:46:21 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:18.155 12:46:21 -- scripts/common.sh@15 -- # shopt -s extglob 00:04:18.155 12:46:21 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:18.155 12:46:21 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:18.155 12:46:21 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:18.155 12:46:21 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:18.156 12:46:21 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:18.156 12:46:21 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:18.156 12:46:21 -- paths/export.sh@5 -- # export PATH 00:04:18.156 12:46:21 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:18.156 12:46:21 -- nvmf/common.sh@51 -- # : 0 00:04:18.156 12:46:21 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:04:18.156 12:46:21 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:04:18.156 12:46:21 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:18.156 12:46:21 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:18.156 12:46:21 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:18.156 12:46:21 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:04:18.156 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:04:18.156 12:46:21 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:04:18.156 12:46:21 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:04:18.156 12:46:21 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:04:18.156 12:46:21 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:18.156 12:46:21 -- spdk/autotest.sh@32 -- # uname -s 00:04:18.156 
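The "line 33: [: : integer expression expected" complaint above comes from the traced test '[' '' -eq 1 ']': the variable under test is empty, and single-bracket -eq requires an integer on both sides, so the test errors out instead of quietly evaluating false. The run continues because nothing checks that exit status, but the pattern is worth recognizing. A short sketch of the failure and the usual guards; SOME_FLAG is a hypothetical stand-in, since the log does not show which variable was empty:

  # Reproduces the error seen above: an empty string fed to an integer comparison.
  SOME_FLAG=""
  [ "$SOME_FLAG" -eq 1 ] && echo enabled        # prints "[: : integer expression expected"

  # Quiet alternatives:
  [ "${SOME_FLAG:-0}" -eq 1 ] && echo enabled               # treat empty/unset as 0
  [[ -n $SOME_FLAG && $SOME_FLAG -eq 1 ]] && echo enabled   # check for emptiness first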
12:46:21 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:18.156 12:46:21 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:18.156 12:46:21 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:18.156 12:46:21 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:18.156 12:46:21 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:18.156 12:46:21 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:18.156 12:46:21 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:18.156 12:46:21 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:18.156 12:46:21 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:18.156 12:46:21 -- spdk/autotest.sh@48 -- # udevadm_pid=85242 00:04:18.156 12:46:21 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:18.156 12:46:21 -- pm/common@17 -- # local monitor 00:04:18.156 12:46:21 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.156 12:46:21 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.156 12:46:21 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.156 12:46:21 -- pm/common@21 -- # date +%s 00:04:18.156 12:46:21 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:18.156 12:46:21 -- pm/common@25 -- # sleep 1 00:04:18.156 12:46:21 -- pm/common@21 -- # date +%s 00:04:18.156 12:46:21 -- pm/common@21 -- # date +%s 00:04:18.156 12:46:21 -- pm/common@21 -- # date +%s 00:04:18.156 12:46:21 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1733399181 00:04:18.156 12:46:21 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1733399181 00:04:18.156 12:46:21 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1733399181 00:04:18.156 12:46:21 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1733399181 00:04:18.156 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1733399181_collect-vmstat.pm.log 00:04:18.156 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1733399181_collect-cpu-load.pm.log 00:04:18.156 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1733399181_collect-cpu-temp.pm.log 00:04:18.156 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1733399181_collect-bmc-pm.bmc.pm.log 00:04:19.098 12:46:22 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:19.098 12:46:22 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:19.098 12:46:22 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:19.098 12:46:22 -- common/autotest_common.sh@10 -- # set +x 
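In the stretch above, autotest.sh saves the distribution's systemd-coredump core_pattern, points kernel core dumps at its own core-collector.sh instead, and starts the four pm monitors (cpu-load, vmstat, cpu-temp, bmc-pm) with a shared date +%s suffix so their logs line up. A sketch of the core_pattern swap follows; the trace shows only the echo, so the redirection into /proc/sys/kernel/core_pattern is an assumption, and the EXIT trap below stands in for the autotest_cleanup trap installed at spdk/autotest.sh@55.

  # Pipe kernel core dumps into a collector script for the duration of the run.
  # Writing /proc/sys/kernel/core_pattern requires root.
  old_core_pattern=$(< /proc/sys/kernel/core_pattern)
  mkdir -p "$output_dir/coredumps"      # $output_dir is assumed; the log uses spdk/../output
  echo "|$rootdir/scripts/core-collector.sh %P %s %t" > /proc/sys/kernel/core_pattern
  trap 'echo "$old_core_pattern" > /proc/sys/kernel/core_pattern' EXIT
  # %P is the dumping PID, %s the signal number, %t the dump time: the same
  # specifiers handed to core-collector.sh in the trace above.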
00:04:19.098 12:46:22 -- spdk/autotest.sh@59 -- # create_test_list 00:04:19.098 12:46:22 -- common/autotest_common.sh@752 -- # xtrace_disable 00:04:19.098 12:46:22 -- common/autotest_common.sh@10 -- # set +x 00:04:19.359 12:46:22 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:04:19.359 12:46:22 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:19.359 12:46:22 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:19.359 12:46:22 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:04:19.359 12:46:22 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:19.359 12:46:22 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:19.359 12:46:22 -- common/autotest_common.sh@1457 -- # uname 00:04:19.359 12:46:22 -- common/autotest_common.sh@1457 -- # '[' Linux = FreeBSD ']' 00:04:19.359 12:46:22 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:19.359 12:46:22 -- common/autotest_common.sh@1477 -- # uname 00:04:19.359 12:46:22 -- common/autotest_common.sh@1477 -- # [[ Linux = FreeBSD ]] 00:04:19.359 12:46:22 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:04:19.359 12:46:22 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:04:19.359 lcov: LCOV version 1.15 00:04:19.359 12:46:22 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:04:25.943 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:27.327 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:04:35.465 12:46:38 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:35.465 12:46:38 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:35.465 12:46:38 -- common/autotest_common.sh@10 -- # set +x 00:04:35.465 12:46:38 -- spdk/autotest.sh@78 -- # rm -f 00:04:35.465 12:46:38 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:38.768 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:38.768 
0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:38.768 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:38.768 12:46:42 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:38.768 12:46:42 -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:38.768 12:46:42 -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:38.768 12:46:42 -- common/autotest_common.sh@1658 -- # zoned_ctrls=() 00:04:38.768 12:46:42 -- common/autotest_common.sh@1658 -- # local -A zoned_ctrls 00:04:38.768 12:46:42 -- common/autotest_common.sh@1659 -- # local nvme bdf ns 00:04:38.768 12:46:42 -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:04:38.768 12:46:42 -- common/autotest_common.sh@1669 -- # bdf=0000:d8:00.0 00:04:38.768 12:46:42 -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:04:38.768 12:46:42 -- common/autotest_common.sh@1671 -- # is_block_zoned nvme0n1 00:04:38.768 12:46:42 -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:38.768 12:46:42 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:38.768 12:46:42 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:38.768 12:46:42 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:38.768 12:46:42 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:38.768 12:46:42 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:38.768 12:46:42 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:38.768 12:46:42 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:38.768 12:46:42 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:39.028 No valid GPT data, bailing 00:04:39.028 12:46:42 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:39.028 12:46:42 -- scripts/common.sh@394 -- # pt= 00:04:39.028 12:46:42 -- scripts/common.sh@395 -- # return 1 00:04:39.028 12:46:42 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:39.028 1+0 records in 00:04:39.028 1+0 records out 00:04:39.028 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00543038 s, 193 MB/s 00:04:39.028 12:46:42 -- spdk/autotest.sh@105 -- # sync 00:04:39.028 12:46:42 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:39.028 12:46:42 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:39.028 12:46:42 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:47.165 12:46:49 -- spdk/autotest.sh@111 -- # uname -s 00:04:47.165 12:46:49 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:47.165 12:46:49 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:47.165 12:46:49 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:47.165 12:46:49 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:47.165 12:46:49 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:47.165 12:46:49 -- common/autotest_common.sh@10 -- # set +x 00:04:47.165 ************************************ 00:04:47.165 
START TEST setup.sh 00:04:47.165 ************************************ 00:04:47.165 12:46:49 setup.sh -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:47.165 * Looking for test storage... 00:04:47.165 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:47.165 12:46:49 setup.sh -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:04:47.165 12:46:49 setup.sh -- common/autotest_common.sh@1711 -- # lcov --version 00:04:47.165 12:46:49 setup.sh -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:04:47.165 12:46:49 setup.sh -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:47.165 12:46:49 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:47.166 12:46:49 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:47.166 12:46:49 setup.sh -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:47.166 12:46:49 setup.sh -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:04:47.166 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.166 --rc genhtml_branch_coverage=1 00:04:47.166 --rc genhtml_function_coverage=1 00:04:47.166 --rc genhtml_legend=1 00:04:47.166 --rc geninfo_all_blocks=1 00:04:47.166 --rc geninfo_unexecuted_blocks=1 00:04:47.166 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.166 ' 00:04:47.166 12:46:49 setup.sh -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:04:47.166 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.166 --rc genhtml_branch_coverage=1 00:04:47.166 --rc genhtml_function_coverage=1 00:04:47.166 --rc genhtml_legend=1 00:04:47.166 --rc geninfo_all_blocks=1 00:04:47.166 --rc geninfo_unexecuted_blocks=1 00:04:47.166 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.166 ' 00:04:47.166 12:46:49 setup.sh -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:04:47.166 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.166 --rc genhtml_branch_coverage=1 00:04:47.166 --rc genhtml_function_coverage=1 00:04:47.166 --rc genhtml_legend=1 00:04:47.166 --rc geninfo_all_blocks=1 00:04:47.166 --rc geninfo_unexecuted_blocks=1 00:04:47.166 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.166 ' 00:04:47.166 12:46:49 setup.sh -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:04:47.166 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.166 --rc genhtml_branch_coverage=1 00:04:47.166 --rc genhtml_function_coverage=1 00:04:47.166 --rc genhtml_legend=1 00:04:47.166 --rc geninfo_all_blocks=1 00:04:47.166 --rc geninfo_unexecuted_blocks=1 00:04:47.166 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.166 ' 00:04:47.166 12:46:49 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:47.166 12:46:49 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:47.166 12:46:49 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:47.166 12:46:49 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:47.166 12:46:49 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:47.166 
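The lt 1.15 2 trace above (used to decide which lcov coverage flags to export) is a field-by-field dotted-version compare. A self-contained sketch of the same algorithm, simplified to split on dots only where the scripts/common.sh version also accepts '-' and ':' separators:

# cmp_versions VER1 OP VER2: succeed when the relation holds, comparing
# numeric fields left to right and padding the shorter version with zeros.
cmp_versions() {
    local op=$2 v a b
    local IFS=.
    local -a ver1=($1) ver2=($3)
    for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
        a=${ver1[v]:-0} b=${ver2[v]:-0}
        ((a > b)) && { [[ $op == '>' ]]; return; }
        ((a < b)) && { [[ $op == '<' ]]; return; }
    done
    [[ $op == '==' ]]
}
lt() { cmp_versions "$1" '<' "$2"; }

lt 1.15 2 && echo "lcov 1.15 predates 2"   # succeeds, as in the log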
12:46:49 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:47.166 ************************************ 00:04:47.166 START TEST acl 00:04:47.166 ************************************ 00:04:47.166 12:46:49 setup.sh.acl -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:47.166 * Looking for test storage... 00:04:47.166 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:47.166 (xtrace condensed: the lcov version check and the LCOV_OPTS/LCOV export blocks repeat here verbatim, as in the setup.sh run above, under the setup.sh.acl prefix) 00:04:47.166 12:46:50 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:47.166 12:46:50 setup.sh.acl -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:47.166 12:46:50 setup.sh.acl -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:47.166 12:46:50 setup.sh.acl -- common/autotest_common.sh@1658 -- # zoned_ctrls=() 00:04:47.166 12:46:50 setup.sh.acl --
common/autotest_common.sh@1658 -- # local -A zoned_ctrls 00:04:47.166 12:46:50 setup.sh.acl -- common/autotest_common.sh@1659 -- # local nvme bdf ns 00:04:47.166 12:46:50 setup.sh.acl -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:04:47.166 12:46:50 setup.sh.acl -- common/autotest_common.sh@1669 -- # bdf=0000:d8:00.0 00:04:47.166 12:46:50 setup.sh.acl -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:04:47.166 12:46:50 setup.sh.acl -- common/autotest_common.sh@1671 -- # is_block_zoned nvme0n1 00:04:47.166 12:46:50 setup.sh.acl -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:47.166 12:46:50 setup.sh.acl -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:47.166 12:46:50 setup.sh.acl -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:47.166 12:46:50 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:47.166 12:46:50 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:47.166 12:46:50 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:47.166 12:46:50 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:47.166 12:46:50 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:47.166 12:46:50 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:47.166 12:46:50 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:51.372 12:46:53 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:51.372 12:46:53 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:51.372 12:46:53 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:51.372 12:46:53 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:51.372 12:46:53 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:51.372 12:46:53 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:54.669 Hugepages 00:04:54.669 node hugesize free / total 00:04:54.669 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:54.669 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:54.669 12:46:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.669 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:54.669 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:54.669 12:46:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.669 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:54.669 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:54.669 12:46:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.669 00:04:54.670 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # [[ 
ioatdma == nvme ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.670 (xtrace condensed: the @19 BDF match, @20 ioatdma-driver check, continue, and read repeat identically for each remaining I/OAT DMA function, 0000:00:04.2 through 0000:80:04.3) 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:54.670 12:46:57
setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:54.670 12:46:57 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:54.670 12:46:57 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:54.670 12:46:57 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:54.670 12:46:57 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:54.670 ************************************ 00:04:54.670 START TEST denied 00:04:54.670 ************************************ 00:04:54.670 12:46:57 setup.sh.acl.denied -- common/autotest_common.sh@1129 -- # denied 00:04:54.670 12:46:57 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:54.670 12:46:57 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:54.670 12:46:57 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:54.670 12:46:57 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:54.670 12:46:57 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:58.875 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:04:58.875 12:47:01 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:58.875 12:47:01 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:58.875 12:47:01 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:58.875 12:47:01 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:58.875 12:47:01 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:58.875 12:47:01 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:58.875 12:47:01 
setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:58.875 12:47:01 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:58.875 12:47:01 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:58.875 12:47:01 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:03.077 00:05:03.077 real 0m8.497s 00:05:03.077 user 0m2.658s 00:05:03.077 sys 0m5.176s 00:05:03.077 12:47:06 setup.sh.acl.denied -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:03.077 12:47:06 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:05:03.077 ************************************ 00:05:03.077 END TEST denied 00:05:03.077 ************************************ 00:05:03.077 12:47:06 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:05:03.077 12:47:06 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:03.077 12:47:06 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:03.078 12:47:06 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:03.078 ************************************ 00:05:03.078 START TEST allowed 00:05:03.078 ************************************ 00:05:03.078 12:47:06 setup.sh.acl.allowed -- common/autotest_common.sh@1129 -- # allowed 00:05:03.078 12:47:06 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:05:03.078 12:47:06 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:05:03.078 12:47:06 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:05:03.078 12:47:06 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:05:03.078 12:47:06 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:08.362 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:08.362 12:47:11 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:05:08.362 12:47:11 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:05:08.362 12:47:11 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:05:08.362 12:47:11 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:08.363 12:47:11 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:12.567 00:05:12.567 real 0m9.136s 00:05:12.567 user 0m2.626s 00:05:12.567 sys 0m5.114s 00:05:12.567 12:47:15 setup.sh.acl.allowed -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:12.567 12:47:15 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:05:12.567 ************************************ 00:05:12.567 END TEST allowed 00:05:12.567 ************************************ 00:05:12.567 00:05:12.567 real 0m25.489s 00:05:12.567 user 0m8.182s 00:05:12.567 sys 0m15.532s 00:05:12.567 12:47:15 setup.sh.acl -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:12.567 12:47:15 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:12.567 ************************************ 00:05:12.567 END TEST acl 00:05:12.567 ************************************ 00:05:12.567 12:47:15 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:12.567 12:47:15 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:12.567 12:47:15 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:12.567 12:47:15 setup.sh 
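The denied/allowed pair that just completed is driven entirely by setup.sh's PCI filter variables, visible in the trace as PCI_BLOCKED and PCI_ALLOWED. Typical invocations follow directly from what the log shows (run from the SPDK checkout; both subcommands appear above):

# Keep 0000:d8:00.0 on its kernel driver; setup.sh then prints
# "Skipping denied controller at 0000:d8:00.0", as in the trace.
sudo PCI_BLOCKED='0000:d8:00.0' ./scripts/setup.sh config

# Rebind only 0000:d8:00.0 for userspace I/O (the nvme -> vfio-pci line above).
sudo PCI_ALLOWED='0000:d8:00.0' ./scripts/setup.sh config

# Return every device to its kernel driver between tests.
sudo ./scripts/setup.sh reset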
-- common/autotest_common.sh@10 -- # set +x 00:05:12.567 ************************************ 00:05:12.567 START TEST hugepages 00:05:12.567 ************************************ 00:05:12.567 12:47:15 setup.sh.hugepages -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:12.567 * Looking for test storage... 00:05:12.567 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:12.567 (xtrace condensed: the lcov version check and the LCOV_OPTS/LCOV export blocks repeat here verbatim, as in the setup.sh run above, under the setup.sh.hugepages prefix) 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:05:12.568 12:47:15
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 39283956 kB' 'MemAvailable: 42969816 kB' 'Buffers: 8940 kB' 'Cached: 12561400 kB' 'SwapCached: 0 kB' 'Active: 9605980 kB' 'Inactive: 3663076 kB' 'Active(anon): 9201408 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 702164 kB' 'Mapped: 143384 kB' 'Shmem: 8502692 kB' 'KReclaimable: 230156 kB' 'Slab: 844112 kB' 'SReclaimable: 230156 kB' 'SUnreclaim: 613956 kB' 'KernelStack: 21840 kB' 'PageTables: 8020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433336 kB' 'Committed_AS: 10950328 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214192 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB' 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
00:05:12.568 12:47:15 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ (xtrace condensed: the IFS=': ' / read -r var val _ / [[ $var == Hugepagesize ]] / continue cycle repeats for every remaining /proc/meminfo field, from Buffers through HugePages_Surp, until the Hugepagesize line matches) 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048
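The scan condensed above is get_meminfo pulling a single field out of /proc/meminfo (or a node's meminfo, whose lines carry a "Node <n> " prefix that the mapfile step strips). The same lookup can be written more directly; a sketch, not the verbatim setup/common.sh implementation:

# Print one meminfo value in kB; an optional second argument selects a NUMA node.
#   get_meminfo Hugepagesize     -> 2048 on this machine, per the log
#   get_meminfo HugePages_Free 0 -> node 0's free hugepages
get_meminfo() {
    local get=$1 node=${2:-} mem_f=/proc/meminfo
    [[ -n $node ]] && mem_f=/sys/devices/system/node/node$node/meminfo
    # Strip the per-node "Node <n> " prefix, match the key, print the number.
    sed -E 's/^Node [0-9]+ //' "$mem_f" | awk -v key="$get:" '$1 == key {print $2}'
}

get_meminfo Hugepagesize   # -> 2048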
00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:12.569 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:12.570 12:47:15 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup 00:05:12.570 12:47:15 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:12.570 12:47:15 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:12.570 12:47:15 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:12.570 ************************************ 00:05:12.570 START TEST single_node_setup 00:05:12.570 ************************************ 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1129 -- # single_node_setup 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup 
-- setup/hugepages.sh@48 -- # local size=2097152 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0') 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:05:12.570 12:47:15 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:15.870 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:16.130 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:18.047 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@137 -- # 
verify_nr_hugepages 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41495824 kB' 'MemAvailable: 45181024 kB' 'Buffers: 8940 kB' 'Cached: 12561552 kB' 'SwapCached: 0 kB' 'Active: 9611788 kB' 'Inactive: 3663076 kB' 'Active(anon): 9207216 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 707836 kB' 'Mapped: 143036 kB' 'Shmem: 8502844 kB' 'KReclaimable: 228836 kB' 'Slab: 842056 kB' 'SReclaimable: 228836 kB' 'SUnreclaim: 613220 kB' 'KernelStack: 21808 kB' 'PageTables: 7656 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10951064 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214224 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB' 00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
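The printf above dumps the full meminfo snapshot into the mem array; the records below are the field-by-field scan of that array. As a minimal sketch of the same extraction, assuming only the standard 'Key: value kB' layout of /proc/meminfo and the sysfs per-node meminfo path (the function name is illustrative, not the harness's own helper):

#!/usr/bin/env bash
# Print one /proc/meminfo value, or a per-node value when a node number
# is given; default to 0 when the key is absent, mirroring the traced
# "echo 0" fallthrough. Hypothetical helper name, not SPDK's.
get_meminfo_value() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # Per-node meminfo prefixes every line with "Node <n> "; strip it so
    # both file formats parse the same way, then split on ": " exactly
    # like the traced IFS=': ' read loop does.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "${val:-0}"
            return 0
        fi
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    echo 0
}

get_meminfo_value AnonHugePages      # system-wide value, e.g. "0"
get_meminfo_value HugePages_Total 0  # value for NUMA node 0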
00:05:18.047 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
[xtrace condensed: the read loop emits the same pattern-test/continue pair for every remaining /proc/meminfo field (MemFree through HardwareCorrupted) before the match below]
00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:18.049 12:47:21
setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41495048 kB' 'MemAvailable: 45180244 kB' 'Buffers: 8940 kB' 'Cached: 12561552 kB' 'SwapCached: 0 kB' 'Active: 9610868 kB' 'Inactive: 3663076 kB' 'Active(anon): 9206296 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 707348 kB' 'Mapped: 142948 kB' 'Shmem: 8502844 kB' 'KReclaimable: 228828 kB' 'Slab: 842012 kB' 'SReclaimable: 228828 kB' 'SUnreclaim: 613184 kB' 'KernelStack: 21904 kB' 'PageTables: 8008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10951332 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214208 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB' 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.049 12:47:21 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _
[xtrace condensed: the same pattern-test/continue pair repeats for every /proc/meminfo field (Buffers through HugePages_Rsvd) before the match below]
00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local
get=HugePages_Rsvd 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41495264 kB' 'MemAvailable: 45180460 kB' 'Buffers: 8940 kB' 'Cached: 12561568 kB' 'SwapCached: 0 kB' 'Active: 9610980 kB' 'Inactive: 3663076 kB' 'Active(anon): 9206408 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 706880 kB' 'Mapped: 142948 kB' 'Shmem: 8502860 kB' 'KReclaimable: 228828 kB' 'Slab: 841940 kB' 'SReclaimable: 228828 kB' 'SUnreclaim: 613112 kB' 'KernelStack: 21872 kB' 'PageTables: 7720 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10949852 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214208 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB' 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.051 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.051 12:47:21 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
[xtrace condensed: the per-field scan repeats once more for HugePages_Rsvd; the captured log breaks off mid-scan]
00:05:18.052 12:47:21
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.052 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:18.053 nr_hugepages=1024 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:18.053 resv_hugepages=0 00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:18.053 surplus_hugepages=0 00:05:18.053 12:47:21 
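The scan above is setup/common.sh's get_meminfo helper pulling a single field (here HugePages_Rsvd) out of /proc/meminfo: each line is split on ': ', every non-matching key logs one continue, and the value is echoed on the first match. A minimal sketch of that helper, reconstructed from the traced commands; the per-node branch and the 'Node N ' prefix handling follow the trace, but anything beyond the logged commands is an assumption:

    # Sketch only -- mirrors the traced IFS/read/continue loop, not the exact upstream code.
    get_meminfo() {
            local get=$1 node=${2:-}   # e.g. get_meminfo HugePages_Rsvd, or get_meminfo HugePages_Surp 0
            local var val _
            local mem_f=/proc/meminfo
            # With a node argument, read that node's own meminfo file instead (path as logged later).
            [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
                    mem_f=/sys/devices/system/node/node$node/meminfo
            # Per-node files prefix every line with "Node N "; strip it as common.sh@29 does.
            while IFS=': ' read -r var val _; do
                    [[ $var == "$get" ]] || continue   # each mismatch is one 'continue' in the xtrace
                    echo "$val"                        # here: 0, so resv=0
                    return 0
            done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
            return 1
    }

Called as resv=$(get_meminfo HugePages_Rsvd), which is how hugepages.sh@99 arrives at resv=0.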
00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:18.053 anon_hugepages=0
00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
[xtrace trimmed: remaining get_meminfo locals and node-file existence checks]
00:05:18.053 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41493752 kB' 'MemAvailable: 45178948 kB' 'Buffers: 8940 kB' 'Cached: 12561568 kB' 'SwapCached: 0 kB' 'Active: 9611612 kB' 'Inactive: 3663076 kB' 'Active(anon): 9207040 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 707512 kB' 'Mapped: 142964 kB' 'Shmem: 8502860 kB' 'KReclaimable: 228828 kB' 'Slab: 841940 kB' 'SReclaimable: 228828 kB' 'SUnreclaim: 613112 kB' 'KernelStack: 22000 kB' 'PageTables: 7872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10951376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
[xtrace trimmed: key scan of the dump above, one continue per non-matching field, until HugePages_Total matches]
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes
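get_nodes, invoked at hugepages.sh@111, discovers the NUMA nodes under /sys/devices/system/node and records each node's preallocated hugepage count; on this two-socket box the loop that follows sets nodes_sys[0]=1024 and nodes_sys[1]=0. A sketch with the same names; the trace only shows the resulting assignments, so the per-node sysfs read is an assumption:

    # Sketch only -- rebuilds nodes_sys[] the way the traced loop does.
    get_nodes() {
            local node
            nodes_sys=()
            for node in /sys/devices/system/node/node[0-9]*; do
                    # index by node number; value = that node's preallocated 2048kB hugepages
                    nodes_sys[${node##*node}]=$(<"$node/hugepages/hugepages-2048kB/nr_hugepages")
            done
            no_nodes=${#nodes_sys[@]}
            (( no_nodes > 0 ))   # hugepages.sh@32: at least one node must be visible
    }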
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:18.055 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25767804 kB' 'MemUsed: 6817564 kB' 'SwapCached: 0 kB' 'Active: 3143720 kB' 'Inactive: 143272 kB' 'Active(anon): 2968148 kB' 'Inactive(anon): 0 kB' 'Active(file): 175572 kB' 'Inactive(file): 143272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3106104 kB' 'Mapped: 47264 kB' 'AnonPages: 184116 kB' 'Shmem: 2787260 kB' 'KernelStack: 11608 kB' 'PageTables: 3176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 89672 kB' 'Slab: 379036 kB' 'SReclaimable: 89672 kB' 'SUnreclaim: 289364 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
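With nodes_sys[] filled in, verify_nr_hugepages walks each node under test, folds the reserved and surplus pages into the expected count, and samples HugePages_Surp from the node's own meminfo (the node0 dump above). A sketch of that accounting loop following the hugepages.sh@114-@129 trace; the echoed values come from the log, but the exact loop body is an assumption:

    # Sketch only -- the per-node check behind the 'node0=1024 expecting 1024' line below.
    for node in "${!nodes_test[@]}"; do
            (( nodes_test[node] += resv ))                 # resv=0 in this run
            surp=$(get_meminfo HugePages_Surp "$node")     # node-local meminfo, sketched earlier
            (( nodes_test[node] += surp ))                 # surp=0, so the target stays 1024
            echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
            [[ ${nodes_sys[node]} == "${nodes_test[node]}" ]]   # 1024 == 1024 -> test passes
    done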
[xtrace trimmed: key scan of node0's dump, one continue per non-matching field, until HugePages_Surp matches]
00:05:18.056 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:18.056 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:05:18.056 12:47:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:18.056 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:18.056 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:18.056 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:18.056 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:18.056 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:05:18.056 node0=1024 expecting 1024
00:05:18.056 12:47:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:18.056
00:05:18.056 real 0m5.439s
00:05:18.056 user 0m1.449s
00:05:18.056 sys 0m2.520s
00:05:18.056 12:47:21 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:18.056 12:47:21 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x
00:05:18.056 ************************************
00:05:18.056 END TEST single_node_setup
00:05:18.056 ************************************
00:05:18.056 12:47:21 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc
00:05:18.056 12:47:21 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:18.056 12:47:21 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:18.056 12:47:21 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:18.056 ************************************
00:05:18.056 START TEST even_2G_alloc
00:05:18.056 ************************************
00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1129 -- # even_2G_alloc
00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152
00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
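even_2G_alloc requests 2097152 kB of hugepages; at the default 2048 kB page size that is nr_hugepages=1024, which get_test_nr_hugepages_per_node then spreads evenly across both nodes, producing the nodes_test[_no_nodes - 1]=512 assignments that follow. A sketch of that split, simplified from the traced hugepages.sh@57-@83 flow; the loop shape is an assumption:

    # Sketch only -- even split of the test pages across NUMA nodes.
    get_test_nr_hugepages_per_node() {
            local _nr_hugepages=$1 _no_nodes=$2
            local _per=$(( _nr_hugepages / _no_nodes ))
            nodes_test=()
            while (( _no_nodes > 0 )); do
                    nodes_test[--_no_nodes]=$_per   # fills index 1, then 0, as in the trace
            done
    }

    get_test_nr_hugepages_per_node 1024 2   # -> nodes_test=(512 512); NRHUGE=1024 setup output then applies it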
_nr_hugepages=1024 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024 00:05:18.056 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output 00:05:18.057 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:18.057 12:47:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:21.354 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:21.354 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:21.354 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:21.354 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:21.354 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:21.354 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:21.354 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:21.354 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:21.618 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:21.618 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:21.618 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:21.618 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:21.618 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:21.618 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:21.618 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:21.618 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:21.618 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages 00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node 00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:21.618 12:47:24 
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.618 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41508980 kB' 'MemAvailable: 45194152 kB' 'Buffers: 8940 kB' 'Cached: 12561828 kB' 'SwapCached: 0 kB' 'Active: 9614496 kB' 'Inactive: 3663076 kB' 'Active(anon): 9209924 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 710240 kB' 'Mapped: 142056 kB' 'Shmem: 8503120 kB' 'KReclaimable: 228780 kB' 'Slab: 841864 kB' 'SReclaimable: 228780 kB' 'SUnreclaim: 613084 kB' 'KernelStack: 21808 kB' 'PageTables: 7840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10942292 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214224 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
[xtrace condensed: setup/common.sh@31-@32 repeat the IFS=': ' / read -r var val _ / [[ ... ]] / continue cycle for every /proc/meminfo key from MemTotal through HardwareCorrupted until the requested key matches]
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0
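get_meminfo resolves one key at a time by re-reading /proc/meminfo with IFS=': ' and skipping every non-matching line, exactly the read/continue cycle traced above (here anon=0, i.e. AnonHugePages is 0 kB). A self-contained sketch of the same lookup, reconstructed from the trace rather than copied from setup/common.sh, and ignoring the per-node branch visible at common.sh@23/@29:

# get_meminfo-style lookup: split each /proc/meminfo line on ': ',
# skip keys that don't match, print the value of the requested key.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue  # e.g. MemTotal, MemFree, ...
        echo "$val"                       # the kB unit lands in $_ and is dropped
        return 0
    done </proc/meminfo
    return 1  # key not present
}

anon=$(get_meminfo AnonHugePages)   # 0 on this host
surp=$(get_meminfo HugePages_Surp)  # 0 on this host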
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.619 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.620 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41509760 kB' 'MemAvailable: 45194932 kB' 'Buffers: 8940 kB' 'Cached: 12561828 kB' 'SwapCached: 0 kB' 'Active: 9613676 kB' 'Inactive: 3663076 kB' 'Active(anon): 9209104 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 709352 kB' 'Mapped: 141972 kB' 'Shmem: 8503120 kB' 'KReclaimable: 228780 kB' 'Slab: 841908 kB' 'SReclaimable: 228780 kB' 'SUnreclaim: 613128 kB' 'KernelStack: 21728 kB' 'PageTables: 7556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10942308 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214176 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
[xtrace condensed: setup/common.sh@31-@32 repeat the read/continue cycle for every /proc/meminfo key from MemTotal through HugePages_Rsvd until HugePages_Surp matches]
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0
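All three meminfo dumps agree on the hugepage state: HugePages_Total 1024, HugePages_Free 1024, HugePages_Rsvd 0, HugePages_Surp 0, Hugepagesize 2048 kB. With anon and surp both 0, a verify step only has to compare the persistent pool against the expected 1024 pages; the check below is a plausible sketch of that comparison, not the exact arithmetic of setup/hugepages.sh:

# Plausible verify sketch: the field names are real /proc/meminfo keys,
# the pass/fail policy is an assumption.
meminfo() { awk -v k="$1" -F': +' '$1 == k { print $2 + 0 }' /proc/meminfo; }

expected=1024
total=$(meminfo HugePages_Total)
surp=$(meminfo HugePages_Surp)
rsvd=$(meminfo HugePages_Rsvd)

# Surplus pages are transient, so count only the persistent pool.
if ((total - surp != expected)); then
    echo "unexpected hugepage pool: total=$total surp=$surp rsvd=$rsvd" >&2
    exit 1
fi
echo "hugepage pool ok: total=$total rsvd=$rsvd surp=$surp"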
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.621 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41510536 kB' 'MemAvailable: 45195708 kB' 'Buffers: 8940 kB' 'Cached: 12561852 kB' 'SwapCached: 0 kB' 'Active: 9614612 kB' 'Inactive: 3663076 kB' 'Active(anon): 9210040 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 710276 kB' 'Mapped: 141972 kB' 'Shmem: 8503144 kB' 'KReclaimable: 228780 kB' 'Slab: 841908 kB' 'SReclaimable: 228780 kB' 'SUnreclaim: 613128 kB' 'KernelStack: 21728 kB' 'PageTables: 7556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10954604 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214176 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
[xtrace condensed: setup/common.sh@31-@32 repeat the read/continue cycle for every /proc/meminfo key from MemTotal through HugePages_Free until HugePages_Rsvd matches]
00:05:21.906 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:21.906 12:47:24 setup.sh.hugepages.even_2G_alloc --
setup/common.sh@33 -- # echo 0 00:05:21.906 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:21.906 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:21.906 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:21.906 nr_hugepages=1024 00:05:21.906 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:21.906 resv_hugepages=0 00:05:21.906 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:21.906 surplus_hugepages=0 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:21.907 anon_hugepages=0 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.907 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41512896 kB' 'MemAvailable: 45198068 kB' 'Buffers: 8940 kB' 'Cached: 12561892 kB' 'SwapCached: 0 kB' 'Active: 9613860 kB' 'Inactive: 3663076 kB' 'Active(anon): 9209288 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 709472 kB' 'Mapped: 141972 kB' 'Shmem: 8503184 kB' 'KReclaimable: 228780 kB' 'Slab: 841868 kB' 'SReclaimable: 228780 kB' 'SUnreclaim: 613088 kB' 'KernelStack: 21696 kB' 'PageTables: 7436 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10941984 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214112 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 
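The scan above is the whole of get_meminfo's parsing trick: with IFS set to ': ', each meminfo line splits into a key, a value and a trailing unit, and the loop simply continues until the key equals the requested field (the backslash-riddled \H\u\g\e\P\a\g\e\s\_\R\s\v\d is just how bash xtrace prints the quoted comparison string, one escape per character). A minimal, self-contained sketch of the same technique (meminfo_field is a hypothetical name, not a helper SPDK ships):

    #!/usr/bin/env bash
    # Print the value column for one /proc/meminfo key, the way the
    # traced common.sh loop does: split on ': ', stop at first match.
    meminfo_field() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            # "HugePages_Rsvd:    0" -> var=HugePages_Rsvd, val=0
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done </proc/meminfo
        return 1
    }
    meminfo_field HugePages_Rsvd    # printed 0 in the run above

That returned value lands in resv=0, which is why resv_hugepages=0 is reported alongside nr_hugepages=1024 before the function is invoked again for HugePages_Total.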
00:05:21.908 [common.sh@31-32: the read loop walks the snapshot above key by key, from MemTotal through Unaccepted, and hits continue for each until it reaches HugePages_Total]
00:05:21.913 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:21.913 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024
00:05:21.913 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:21.913 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:21.913 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:21.913 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node
00:05:21.913 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:21.913 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:21.913 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
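With HugePages_Total read back as 1024, hugepages.sh@109 asserts the accounting identity total == nr_hugepages + surplus + reserved, then get_nodes enumerates the NUMA nodes with an extglob pattern and pencils in 512 pages apiece, since even_2G_alloc wants the 1024 pages split evenly. A sketch of that bookkeeping, assuming the same two-node, 1024-page layout as this run:

    #!/usr/bin/env bash
    shopt -s extglob                   # +([0-9]) below is extglob syntax
    nr_hugepages=1024 surp=0 resv=0    # values echoed earlier in the log
    (( 1024 == nr_hugepages + surp + resv )) || exit 1
    nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
        # /sys/devices/system/node/node0 -> array index 0, and so on
        nodes_sys[${node##*node}]=512
    done
    echo "no_nodes=${#nodes_sys[@]}"   # -> no_nodes=2 on this box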
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.914 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26826988 kB' 'MemUsed: 5758380 kB' 'SwapCached: 0 kB' 'Active: 3143536 kB' 'Inactive: 143272 kB' 'Active(anon): 2967964 kB' 'Inactive(anon): 0 kB' 'Active(file): 175572 kB' 'Inactive(file): 143272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3106320 kB' 'Mapped: 47064 kB' 'AnonPages: 183644 kB' 'Shmem: 2787476 kB' 'KernelStack: 11528 kB' 'PageTables: 2828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 89672 kB' 'Slab: 379000 kB' 'SReclaimable: 89672 kB' 'SUnreclaim: 289328 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:21.914 [common.sh@31-32: the read loop continues past every node0 key from MemTotal through HugePages_Free until it reaches HugePages_Surp]
00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
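Node 0 checks out: HugePages_Surp is 0, so the expected 512 pages stand. The per-node read works because get_meminfo, when given a node argument, swaps mem_f over to /sys/devices/system/node/node0/meminfo and strips the "Node 0 " prefix those lines carry before rerunning the same key/value loop. A sketch of that variant, with the node number hardcoded for illustration:

    #!/usr/bin/env bash
    shopt -s extglob
    node=0 get=HugePages_Surp
    mem_f=/proc/meminfo
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem <"$mem_f"
    # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] && { echo "$val"; break; }
    done

The same pass then repeats for node 1 below.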
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.915 12:47:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 14686380 kB' 'MemUsed: 13012024 kB' 'SwapCached: 0 kB' 'Active: 6470896 kB' 'Inactive: 3519804 kB' 'Active(anon): 6241896 kB' 'Inactive(anon): 0 kB' 'Active(file): 229000 kB' 'Inactive(file): 3519804 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9464536 kB' 'Mapped: 94908 kB' 'AnonPages: 526308 kB' 'Shmem: 5715732 kB' 'KernelStack: 10184 kB' 'PageTables: 4644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 139108 kB' 'Slab: 462868 kB' 'SReclaimable: 139108 kB' 'SUnreclaim: 323760 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:21.915 12:47:25 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.915 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
(the get_meminfo scan then checks each remaining node-meminfo field, SwapCached, Active, Inactive, Active/Inactive(anon), Active/Inactive(file), Unevictable, Mlocked, Dirty, Writeback, FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total and HugePages_Free, against HugePages_Surp and skips each with "continue")
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
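For readers following the trace: what just ran is setup/common.sh's get_meminfo, which walks a meminfo file one "Key: value" pair at a time and echoes the value once the requested key matches; every other field is skipped with continue, which is what produces the long runs above. A minimal standalone sketch of that technique (the function name and the simplified per-node handling are illustrative, not the SPDK source):

```bash
#!/usr/bin/env bash
# Sketch of the field scan seen in the trace: split each meminfo line on
# ": " into name/value, skip non-matching names, echo the match and stop.
get_meminfo_sketch() {
    local get=$1 node=${2:-}              # e.g. HugePages_Surp, optional NUMA node
    local mem_f=/proc/meminfo var val _
    # Per-node files live under /sys; their "Node <n> " line prefix is
    # stripped in the real script before this loop (see the mem=() step later).
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue  # the repeated "continue" entries above
        echo "${val:-0}"
        return 0
    done < "$mem_f"
    return 1
}

get_meminfo_sketch HugePages_Surp         # prints 0 on the machine in this log
```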
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:05:21.916 node0=512 expecting 512
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
00:05:21.916 node1=512 expecting 512
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]]
00:05:21.916
00:05:21.916 real 0m3.744s
00:05:21.916 user 0m1.405s
00:05:21.916 sys 0m2.406s
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:21.916 12:47:25 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:21.916 ************************************
00:05:21.916 END TEST even_2G_alloc
00:05:21.916 ************************************
00:05:21.916 12:47:25 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc
00:05:21.916 12:47:25 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:21.916 12:47:25 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:21.916 12:47:25 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:21.916 ************************************
00:05:21.916 START TEST odd_alloc
00:05:21.916 ************************************
00:05:21.916 12:47:25 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1129 -- # odd_alloc
00:05:21.916 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176
00:05:21.916 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
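The numbers around the test boundary line up: even_2G_alloc asked for 2 GiB of 2048 kB pages (1024 pages, split 512 + 512, hence "node0=512 expecting 512"), while odd_alloc now requests size 2098176 kB, which at the same page size comes out to an odd 1025 pages (the later snapshots show 'HugePages_Total: 1025' and 'Hugetlb: 2099200 kB', i.e. 1025 * 2048 kB). A worked sketch of that arithmetic; the round-up step is an assumption about how the script maps size to a page count:

```bash
# Worked numbers behind the two tests (values taken from this log).
pagesize_kb=2048                                    # Hugepagesize: 2048 kB
even_kb=$((2 * 1024 * 1024))                        # even_2G_alloc: 2 GiB = 2097152 kB
echo $((even_kb / pagesize_kb))                     # -> 1024 pages, 512 per node

odd_kb=2098176                                      # odd_alloc: HUGEMEM=2049 MiB
echo $(((odd_kb + pagesize_kb - 1) / pagesize_kb))  # round up -> 1025 pages
```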
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:21.917 12:47:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:25.218 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:25.218 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:25.482 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:25.482 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:25.482 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:25.482 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
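The @80-@83 entries above are consistent with a back-to-front per-node split: node1 gets 1025/2 = 512 pages, the ": 513" and ": 1" no-ops are the shrinking remainder and node counter, and node0 then takes the remaining 513. A reconstruction of that loop (the variable names follow the trace; this is not the verbatim SPDK script):

```bash
# Reconstruction of the per-node split traced at setup/hugepages.sh@80-83:
# 1025 pages over 2 nodes, assigned back to front, remainder on node 0.
_nr_hugepages=1025
_no_nodes=2
declare -a nodes_test
while ((_no_nodes > 0)); do
    nodes_test[_no_nodes - 1]=$((_nr_hugepages / _no_nodes)) # node1: 1025/2 = 512
    : $((_nr_hugepages -= nodes_test[_no_nodes - 1]))        # the ": 513" entry
    : $((_no_nodes--))                                       # the ": 1" entry
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"         # node0=513 node1=512
```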
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:25.482 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41511184 kB' 'MemAvailable: 45196372 kB' 'Buffers: 8940 kB' 'Cached: 12562012 kB' 'SwapCached: 0 kB' 'Active: 9619956 kB' 'Inactive: 3663076 kB' 'Active(anon): 9215384 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 715012 kB' 'Mapped: 142088 kB' 'Shmem: 8503304 kB' 'KReclaimable: 228812 kB' 'Slab: 842088 kB' 'SReclaimable: 228812 kB' 'SUnreclaim: 613276 kB' 'KernelStack: 21776 kB' 'PageTables: 7756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 10943312 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214192 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
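The mapfile/mem=() pair above is worth a note: common.sh slurps the whole meminfo file into an array, then strips any "Node <n> " prefix with an extglob parameter expansion so per-node files parse exactly like /proc/meminfo. A small sketch of that step (the node0 path assumes the box exposes per-node meminfo files, as this one does):

```bash
#!/usr/bin/env bash
# Sketch of the slurp-and-strip step traced at setup/common.sh@28-29.
shopt -s extglob                     # required for +([0-9]) in the expansion
mapfile -t mem < /sys/devices/system/node/node0/meminfo
mem=("${mem[@]#Node +([0-9]) }")     # "Node 0 MemTotal: ..." -> "MemTotal: ..."
printf '%s\n' "${mem[@]:0:3}"        # first three normalized lines
```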
(the AnonHugePages lookup then walks the snapshot above field by field, MemTotal through HardwareCorrupted, comparing each name against AnonHugePages and skipping it with "continue")
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.483 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41511748 kB' 'MemAvailable: 45196936 kB' 'Buffers: 8940 kB' 'Cached: 12562032 kB' 'SwapCached: 0 kB' 'Active: 9618684 kB' 'Inactive: 3663076 kB' 'Active(anon): 9214112 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 714128 kB' 'Mapped: 141992 kB' 'Shmem: 8503324 kB' 'KReclaimable: 228812 kB' 'Slab: 842060 kB' 'SReclaimable: 228812 kB' 'SUnreclaim: 613248 kB' 'KernelStack: 21728 kB' 'PageTables: 7560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 10943328 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214176 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
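With anon=0 banked, the trace repeats the identical lookup for HugePages_Surp and, further down, HugePages_Rsvd, filling the surp/resv/anon locals declared at hugepages.sh@91-93. A plausible shape of that gathering step, with awk standing in for get_meminfo so the sketch is self-contained (a reconstruction, not the verbatim SPDK function):

```bash
# Gather the three counters verify_nr_hugepages tracks in this trace.
meminfo_val() { awk -v k="$1:" '$1 == k { print $2 }' /proc/meminfo; }

anon=$(meminfo_val AnonHugePages)   # 0 here: no transparent hugepages in use
surp=$(meminfo_val HugePages_Surp)  # 0 here: nothing allocated beyond nr_hugepages
resv=$(meminfo_val HugePages_Rsvd)  # fetched next in the trace
echo "anon=$anon surp=$surp resv=$resv"
```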
(the HugePages_Surp lookup then walks the snapshot above field by field, MemTotal through HugePages_Rsvd, skipping every non-matching name with "continue")
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41512192 kB' 'MemAvailable: 45197380 kB' 'Buffers: 8940 kB' 'Cached: 12562032 kB' 'SwapCached: 0 kB' 'Active: 9618852 kB' 'Inactive: 3663076 kB' 'Active(anon): 9214280 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 714368 kB' 'Mapped: 141992 kB' 'Shmem: 8503324 kB' 'KReclaimable: 228812 kB' 'Slab: 842060 kB' 'SReclaimable: 228812 kB' 'SUnreclaim: 613248 kB' 'KernelStack: 21728 kB' 'PageTables: 7564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 10943348 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214176 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
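The three meminfo snapshots in this stretch differ only in transient counters (MemFree, AnonPages, PageTables); the hugepage state itself is stable at Total/Free 1025 and Rsvd/Surp 0. The same counters the harness is scanning for can be spot-checked in one line:

```bash
# One-line cross-check of the counters this test walks field by field;
# on the logged machine it prints Total/Free 1025 and Rsvd/Surp 0.
grep -E '^HugePages_(Total|Free|Rsvd|Surp):' /proc/meminfo
```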
00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.485 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.486 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- 
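The loop traced above and below is common.sh's get_meminfo walking each 'key: value' line of the dump until it hits the requested key. A minimal standalone sketch of that parsing pattern, reconstructed from the trace (the helper name and exact structure here are ours, not copied from SPDK's common.sh):

    #!/usr/bin/env bash
    # Sketch of the meminfo scan traced in this log (assumed reconstruction):
    # read "key: value" pairs, skip non-matching keys, echo the matching value.
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # the repeated 'continue' entries in the trace
            echo "$val"
            return 0
        done < /proc/meminfo
        return 1
    }
    get_meminfo_sketch HugePages_Rsvd   # prints 0 on this box, hence resv=0 below
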
[... trace trimmed: each meminfo key compared against HugePages_Rsvd and skipped with continue until the match ...]
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025
00:05:25.487 nr_hugepages=1025
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:25.487 resv_hugepages=0
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:25.487 surplus_hugepages=0
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:25.487 anon_hugepages=0
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages ))
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.487 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41512612 kB' 'MemAvailable: 45197800 kB' 'Buffers: 8940 kB' 'Cached: 12562056 kB' 'SwapCached: 0 kB' 'Active: 9618944 kB' 'Inactive: 3663076 kB' 'Active(anon): 9214372 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 714468 kB' 'Mapped: 141992 kB' 'Shmem: 8503348 kB' 'KReclaimable: 228812 kB' 'Slab: 842060 kB' 'SReclaimable: 228812 kB' 'SUnreclaim: 613248 kB' 'KernelStack: 21760 kB' 'PageTables: 7628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 10944500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214176 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
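The checks at hugepages.sh@106 and @108 assert that the hugepage counters are self-consistent: the page count must equal the configured nr_hugepages once surplus and reserved pages are added in. Sketched with this run's values (variable names ours; the literal 1025 on the left-hand side is, by our reading of the trace, the page count fetched a step earlier, and it is re-verified against HugePages_Total just below):

    # Consistency check sketched with this run's traced values (our naming):
    nr_hugepages=1025 surp=0 resv=0
    reported=1025                                   # kernel-reported page count
    (( reported == nr_hugepages + surp + resv )) && echo "hugepage accounting consistent"
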
[... trace trimmed: each meminfo key compared against HugePages_Total and skipped with continue until the match ...]
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
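get_nodes just recorded the kernel's per-node totals, 513 pages on node0 and 512 on node1, which is the point of the odd_alloc case: an odd count of 1025 pages cannot split evenly across two NUMA nodes, so one node carries the extra page. A sketch of the expected split (our arithmetic, inferred from the traced values, not SPDK's code):

    # Expected odd split, inferred from the traced values:
    nr_hugepages=1025 no_nodes=2
    per_node=$(( nr_hugepages / no_nodes ))    # 512
    extra=$(( nr_hugepages % no_nodes ))       # 1 leftover page
    echo "node0=$(( per_node + extra )) node1=$per_node"   # node0=513 node1=512
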
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.753 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26819620 kB' 'MemUsed: 5765748 kB' 'SwapCached: 0 kB' 'Active: 3145396 kB' 'Inactive: 143272 kB' 'Active(anon): 2969824 kB' 'Inactive(anon): 0 kB' 'Active(file): 175572 kB' 'Inactive(file): 143272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3106380 kB' 'Mapped: 47084 kB' 'AnonPages: 185624 kB' 'Shmem: 2787536 kB' 'KernelStack: 11544 kB' 'PageTables: 2912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 89672 kB' 'Slab: 379120 kB' 'SReclaimable: 89672 kB' 'SUnreclaim: 289448 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[... trace trimmed: node0 meminfo keys compared against HugePages_Surp and skipped with continue until the match ...]
00:05:25.754 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:25.754 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.754 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:25.754 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:25.754 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:25.754 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:25.754 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
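When get_meminfo is given a node argument, the trace shows it switching mem_f from /proc/meminfo to that node's sysfs file; the per-node lines carry a "Node <n> " prefix, which the extglob substitution at common.sh@29 strips before the same key scan runs. A minimal standalone reproduction of that read (a sketch of the pattern, not the SPDK helper itself):

    # Standalone sketch of the per-node read traced above:
    shopt -s extglob                  # required for the +([0-9]) pattern below
    node=1
    mapfile -t mem < "/sys/devices/system/node/node${node}/meminfo"
    mem=("${mem[@]#Node +([0-9]) }")  # "Node 1 HugePages_Total: 512" -> "HugePages_Total: 512"
    printf '%s\n' "${mem[@]}" | awk -F': +' '$1 == "HugePages_Total" {print $2}'   # 512
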
00:05:25.755 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:25.755 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.755 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.755 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.755 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:25.755 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 14697176 kB' 'MemUsed: 13001228 kB' 'SwapCached: 0 kB' 'Active: 6474324 kB' 'Inactive: 3519804 kB' 'Active(anon): 6245324 kB' 'Inactive(anon): 0 kB' 'Active(file): 229000 kB' 'Inactive(file): 3519804 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9464636 kB' 'Mapped: 94908 kB' 'AnonPages: 529724 kB' 'Shmem: 5715832 kB' 'KernelStack: 10152 kB' 'PageTables: 4560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 139140 kB' 'Slab: 462940 kB' 'SReclaimable: 139140 kB' 'SUnreclaim: 323800 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace elided: the per-key scan skips every node1 meminfo key from MemTotal through HugePages_Free, using the same continue pattern as above ...]
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513'
node0=513 expecting 513
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:05:25.756 real 0m3.749s
00:05:25.756 user 0m1.383s
00:05:25.756 sys 0m2.431s
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:25.756 12:47:28 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:25.756 ************************************
00:05:25.756 END TEST odd_alloc
00:05:25.756 ************************************
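The repeated read/continue pattern throughout the test above is the body of common.sh's get_meminfo helper: it loads either /proc/meminfo or a per-node /sys/devices/system/node/nodeN/meminfo file, strips the "Node <N> " prefix that the per-node files carry, and scans key by key until the requested counter is found. A minimal reconstruction from the traced commands follows; the loop wrapper and the return-on-miss behavior are assumptions, since only the individual commands appear in the trace:

    shopt -s extglob   # the +([0-9]) prefix-strip below needs extended globbing

    get_meminfo() {    # usage: get_meminfo <Key> [<node>]
        local get=$1 node=$2
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # Prefer the per-node counters when a node was given and the file exists.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node meminfo prefixes every line with "Node <N> "; strip it.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1   # assumption: requested key not present
    }

    get_meminfo HugePages_Surp 1   # prints 0 against the node1 state dumped above

Run against the node1 dump shown earlier, the scan skips every key until HugePages_Surp and echoes its value (0), which is exactly the echo 0 / return 0 pair visible in the trace.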
00:05:25.756 12:47:28 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test custom_alloc custom_alloc
00:05:25.756 12:47:28 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:25.756 12:47:28 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:25.756 12:47:28 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:25.756 ************************************
00:05:25.756 START TEST custom_alloc
00:05:25.756 ************************************
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1129 -- # custom_alloc
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=,
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=()
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # local nodes_hp
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
[... xtrace elided: with no user-specified nodes and nodes_hp still empty, the 512 pages are split evenly from the last node down: nodes_test[1]=256, then nodes_test[0]=256 ...]
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@166 -- # (( 2 > 1 ))
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
[... xtrace elided: nodes_hp already holds an entry for node 0, so this pass seeds nodes_test[0]=512 from it and returns ...]
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:25.756 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node
[... xtrace elided: the final pass mirrors nodes_hp into nodes_test: nodes_test[0]=512, nodes_test[1]=1024 ...]
00:05:25.757 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
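The HUGENODE value just assembled is the result of simple arithmetic: get_test_nr_hugepages converts a size in kB into a page count by dividing by the default hugepage size, and the per-node plan is comma-joined under IFS=,. A small self-contained illustration of that arithmetic; default_hugepages=2048 is an assumption, consistent with the 'Hugepagesize: 2048 kB' reported in the meminfo dumps below:

    default_hugepages=2048   # kB; assumed from 'Hugepagesize: 2048 kB'

    # size-to-pages conversion consistent with the traced assignments:
    echo $(( 1048576 / default_hugepages ))   # 512  -> nodes_hp[0]
    echo $(( 2097152 / default_hugepages ))   # 1024 -> nodes_hp[1]

    # HUGENODE is then assembled per node and comma-joined via IFS=,:
    nodes_hp=([0]=512 [1]=1024)
    HUGENODE=()
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
    done
    IFS=,
    echo "HUGENODE='${HUGENODE[*]}'"   # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'

The total, 512 + 1024 = 1536 pages, is the nr_hugepages=1536 that appears after setup.sh runs below.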
00:05:25.757 12:47:28 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output
00:05:25.757 12:47:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:25.757 12:47:28 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:29.059 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:29.059 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:29.059 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:29.059 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:29.059 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:29.059 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:29.059 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:29.059 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:29.059 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:29.059 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:29.059 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:29.325 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:29.325 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:29.325 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:29.325 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:29.325 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:29.325 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88 -- # local node
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
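verify_nr_hugepages begins by collecting the global counters it will reconcile against the per-node plan: AnonHugePages (only consulted because transparent hugepages report "always [madvise] never" rather than "[never]") and HugePages_Surp, both read through get_meminfo with no node argument so that /proc/meminfo is used. A sketch of that gathering step; the THP sysfs path and the function wrapper are assumptions, while the variable names come from the trace:

    # Sketch only; reuses the get_meminfo() reconstruction shown earlier.
    gather_global_counters() {
        local surp anon thp
        anon=0
        # Assumed source of the traced "always [madvise] never" string:
        thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
        # AnonHugePages is only consulted when THP is not hard-disabled.
        [[ $thp != *"[never]"* ]] && anon=$(get_meminfo AnonHugePages)
        surp=$(get_meminfo HugePages_Surp)
        echo "anon=${anon:-0} surp=${surp:-0}"
    }

Both calls are visible next in the trace, each returning 0 against the system-wide dumps that follow.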
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.325 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.326 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40474272 kB' 'MemAvailable: 44159460 kB' 'Buffers: 8940 kB' 'Cached: 12562428 kB' 'SwapCached: 0 kB' 'Active: 9622624 kB' 'Inactive: 3663076 kB' 'Active(anon): 9218052 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 717176 kB' 'Mapped: 142432 kB' 'Shmem: 8503720 kB' 'KReclaimable: 228812 kB' 'Slab: 842300 kB' 'SReclaimable: 228812 kB' 'SUnreclaim: 613488 kB' 'KernelStack: 21728 kB' 'PageTables: 7608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 10944244 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
[... xtrace elided: the per-key scan skips every key from MemTotal through HardwareCorrupted against AnonHugePages ...]
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.327 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.328 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40476024 kB' 'MemAvailable: 44161180 kB' 'Buffers: 8940 kB' 'Cached: 12562432 kB' 'SwapCached: 0 kB' 'Active: 9622176 kB' 'Inactive: 3663076 kB' 'Active(anon): 9217604 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 717308 kB' 'Mapped: 142004 kB' 'Shmem: 8503724 kB' 'KReclaimable: 228748 kB' 'Slab: 842188 kB' 'SReclaimable: 228748 kB' 'SUnreclaim: 613440 kB' 'KernelStack: 21744 kB' 'PageTables: 7620 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 10944260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
[... xtrace elided: the HugePages_Surp scan repeats the per-key pattern, skipping MemTotal through ShmemPmdMapped ...]
00:05:29.329 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.329 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.329 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.329 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.329 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.329 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.329 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.329 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.329 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:29.330 12:47:32 
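[editor's note] The \H\u\g\e\P\a\g\e\s\_\S\u\r\p strings in this trace are not corruption: under `set -x`, bash prints a quoted (literal, non-glob) right-hand side of a `[[ == ]]` comparison with every character backslash-escaped. A minimal demo reproducing the effect (variable names are illustrative, not from the SPDK scripts):

    #!/usr/bin/env bash
    # Under xtrace, the quoted pattern below is rendered as
    # \H\u\g\e\P\a\g\e\s\_\S\u\r\p, exactly as in the log above.
    set -x
    get=HugePages_Surp
    [[ MemTotal == "$get" ]] || echo "no match, the loop would continue"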
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40476052 kB' 'MemAvailable: 44161208 kB' 'Buffers: 8940 kB' 'Cached: 12562448 kB' 'SwapCached: 0 kB' 'Active: 9622200 kB' 'Inactive: 3663076 kB' 'Active(anon): 9217628 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 717304 kB' 'Mapped: 142004 kB' 'Shmem: 8503740 kB' 'KReclaimable: 228748 kB' 'Slab: 842188 kB' 'SReclaimable: 228748 kB' 'SUnreclaim: 613440 kB' 'KernelStack: 21744 kB' 'PageTables: 7620 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 10944280 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
00:05:29.330 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: read/compare loop over every /proc/meminfo key (MemTotal ... HugePages_Free); all hit "continue" until HugePages_Rsvd matches]
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536
00:05:29.333 nr_hugepages=1536
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:29.333 resv_hugepages=0
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:29.333 surplus_hugepages=0
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:29.333 anon_hugepages=0
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages ))
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
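[editor's note] The get_meminfo helper being traced here follows one simple pattern: pick /proc/meminfo (or a node's own meminfo), strip any "Node <n> " prefix, then read key/value pairs with IFS=': ' and print the value of the requested key. A minimal standalone sketch of that pattern, mirroring the traced logic but not the verbatim setup/common.sh source (names follow the trace):

    #!/usr/bin/env bash
    shopt -s extglob   # for the +([0-9]) pattern used below

    # Scan a meminfo file for one key and print its value.
    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo line var val _
        # A per-node query reads that node's own meminfo when present.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while read -r line; do
            line=${line#Node +([0-9]) }        # per-node lines start "Node <n> "
            IFS=': ' read -r var val _ <<<"$line"
            [[ $var == "$get" ]] || continue   # xtrace renders this as \H\u\g\e...
            echo "$val"
            return 0
        done <"$mem_f"
        return 1
    }

    get_meminfo HugePages_Rsvd      # prints 0 on this box
    get_meminfo HugePages_Total 0   # prints 512 (node 0)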
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.333 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.598 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40476640 kB' 'MemAvailable: 44161796 kB' 'Buffers: 8940 kB' 'Cached: 12562476 kB' 'SwapCached: 0 kB' 'Active: 9622444 kB' 'Inactive: 3663076 kB' 'Active(anon): 9217872 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 717500 kB' 'Mapped: 142004 kB' 'Shmem: 8503768 kB' 'KReclaimable: 228748 kB' 'Slab: 842188 kB' 'SReclaimable: 228748 kB' 'SUnreclaim: 613440 kB' 'KernelStack: 21760 kB' 'PageTables: 7672 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 10944304 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
00:05:29.598 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.598 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.598 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: read/compare loop over every /proc/meminfo key (MemTotal ... Unaccepted); all hit "continue" until HugePages_Total matches]
00:05:29.600 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:29.600 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536
00:05:29.600 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:29.600 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:29.600 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:29.600 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node
00:05:29.600 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:29.600 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:29.600 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:29.600 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 26823872 kB' 'MemUsed: 5761496 kB' 'SwapCached: 0 kB' 'Active: 3145364 kB' 'Inactive: 143272 kB' 'Active(anon): 2969792 kB' 'Inactive(anon): 0 kB' 'Active(file): 175572 kB' 'Inactive(file): 143272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3106664 kB' 'Mapped: 47096 kB' 'AnonPages: 185148 kB' 'Shmem: 2787820 kB' 'KernelStack: 11592 kB' 'PageTables: 3064 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 89672 kB' 'Slab: 379508 kB' 'SReclaimable: 89672 kB' 'SUnreclaim: 289836 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
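[editor's note] The node0 dump above shows 512 of the 1536 hugepages pinned to node 0 (get_nodes recorded nodes_sys[0]=512 and nodes_sys[1]=1024 a few entries earlier; 512 + 1024 = 1536, and 1536 x 2048 kB = 3145728 kB, matching the 'Hugetlb:' line in the global dump). A hedged sketch of the same per-node cross-check, written against the standard sysfs layout rather than the SPDK helpers:

    #!/usr/bin/env bash
    # Sum HugePages_Total across NUMA nodes and compare with the
    # global /proc/meminfo figure (512 + 1024 == 1536 on this box).
    total=0
    for f in /sys/devices/system/node/node[0-9]*/meminfo; do
        # per-node format: "Node 0 HugePages_Total:   512"
        n=$(awk '$3 == "HugePages_Total:" { print $4 }' "$f")
        (( total += n ))
    done
    global=$(awk '/^HugePages_Total:/ { print $2 }' /proc/meminfo)
    if (( total == global )); then
        echo "OK: $total hugepages accounted for across nodes"
    else
        echo "mismatch: nodes=$total global=$global"
    fi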
26823872 kB' 'MemUsed: 5761496 kB' 'SwapCached: 0 kB' 'Active: 3145364 kB' 'Inactive: 143272 kB' 'Active(anon): 2969792 kB' 'Inactive(anon): 0 kB' 'Active(file): 175572 kB' 'Inactive(file): 143272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3106664 kB' 'Mapped: 47096 kB' 'AnonPages: 185148 kB' 'Shmem: 2787820 kB' 'KernelStack: 11592 kB' 'PageTables: 3064 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 89672 kB' 'Slab: 379508 kB' 'SReclaimable: 89672 kB' 'SUnreclaim: 289836 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- 
# continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.601 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.603 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.603 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 13653572 kB' 'MemUsed: 14044832 kB' 'SwapCached: 0 kB' 'Active: 6477920 kB' 'Inactive: 3519804 kB' 'Active(anon): 6248920 kB' 'Inactive(anon): 0 kB' 'Active(file): 229000 kB' 'Inactive(file): 
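A note on the trace above: setup/common.sh's get_meminfo helper loads the chosen meminfo file and walks it one 'field: value' pair at a time, which is why xtrace prints one [[ ... ]] / continue pair per field (the right-hand side appears escaped, e.g. \H\u\g\e\P\a\g\e\s\_\S\u\r\p, because bash's trace quotes the literal match string). A minimal standalone sketch of the same idea, assuming plain bash; the real common.sh mapfiles the whole file and strips the 'Node N ' prefix with an extglob, as the trace shows:

  # get_meminfo FIELD [NODE] - print FIELD's value from /proc/meminfo, or from
  # the per-node copy under sysfs when NODE is given and that file exists.
  get_meminfo() {
      local get=$1 node=$2 line var val _
      local mem_f=/proc/meminfo
      [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
          mem_f=/sys/devices/system/node/node$node/meminfo
      while IFS= read -r line; do
          line=${line#"Node $node "}        # per-node lines carry a 'Node N' prefix
          IFS=': ' read -r var val _ <<<"$line"
          if [[ $var == "$get" ]]; then     # e.g. HugePages_Surp
              echo "$val"
              return 0
          fi
      done <"$mem_f"
      return 1
  }

  get_meminfo HugePages_Surp 0    # prints 0 on this box, per the node0 dump above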
00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:29.602 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:29.603 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 13653572 kB' 'MemUsed: 14044832 kB' 'SwapCached: 0 kB' 'Active: 6477920 kB' 'Inactive: 3519804 kB' 'Active(anon): 6248920 kB' 'Inactive(anon): 0 kB' 'Active(file): 229000 kB' 'Inactive(file): 3519804 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9464788 kB' 'Mapped: 94908 kB' 'AnonPages: 533124 kB' 'Shmem: 5715984 kB' 'KernelStack: 10168 kB' 'PageTables: 4608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 139076 kB' 'Slab: 462680 kB' 'SReclaimable: 139076 kB' 'SUnreclaim: 323604 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace condensed: field-by-field scan of the node1 meminfo dump above until HugePages_Surp matches]
00:05:29.604 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:29.604 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:29.604 12:47:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:29.604 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:29.604 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:29.604 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:29.604 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:29.604 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:05:29.604 node0=512 expecting 512
00:05:29.604 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:29.604 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:29.605 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:29.605 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024'
00:05:29.605 node1=1024 expecting 1024
00:05:29.605 12:47:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:05:29.605 
00:05:29.605 real 0m3.770s
00:05:29.605 user 0m1.446s
00:05:29.605 sys 0m2.393s
00:05:29.605 12:47:32 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:29.605 12:47:32 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:29.605 ************************************
00:05:29.605 END TEST custom_alloc
00:05:29.605 ************************************
00:05:29.605 12:47:32 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc
00:05:29.605 12:47:32 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:29.605 12:47:32 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:29.605 12:47:32 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
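The custom_alloc pass/fail logic just traced boils down to two checks: the system-wide HugePages_Total (1536) must equal nr_hugepages plus surplus plus reserved pages, and each node's count must match what the test requested (512 on node0, 1024 on node1, hence the 'nodeN=... expecting ...' lines and the final 512,1024 comparison). A condensed restatement reusing the get_meminfo sketch above; the 'expected' array is a hypothetical stand-in for the script's nodes_test/nodes_sys bookkeeping:

  nr_hugepages=1536 surp=0 resv=0
  total=$(get_meminfo HugePages_Total)
  (( total == nr_hugepages + surp + resv )) || echo "unexpected total: $total"

  declare -A expected=([0]=512 [1]=1024)
  for node in "${!expected[@]}"; do
      got=$(get_meminfo HugePages_Total "$node")
      echo "node$node=$got expecting ${expected[$node]}"   # mirrors the log output
  done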
00:05:29.605 ************************************
00:05:29.605 START TEST no_shrink_alloc
00:05:29.605 ************************************
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1129 -- # no_shrink_alloc
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0')
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:29.605 12:47:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
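NRHUGE and HUGENODE are the knobs scripts/setup.sh reads here: allocate NRHUGE 2 MiB hugepages pinned to NUMA node HUGENODE before rebinding devices (the vfio-pci lines that follow are its output). setup.sh itself is not shown in this log; the stock kernel interface such a setup drives for per-node allocation is the nr_hugepages sysfs file, roughly as below. Values are taken from the trace; the exact steps setup.sh performs may differ.

  # Request 1024 2 MiB hugepages on NUMA node 0 via the standard sysfs path.
  NRHUGE=1024 HUGENODE=0
  echo "$NRHUGE" | sudo tee \
      "/sys/devices/system/node/node$HUGENODE/hugepages/hugepages-2048kB/nr_hugepages"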
00:05:32.908 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:32.908 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:32.908 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:32.908 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:32.908 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:32.908 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:32.908 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:32.908 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:32.908 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:32.908 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:32.908 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:33.173 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:33.173 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:33.173 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:33.173 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:33.173 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:33.173 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:33.173 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages
00:05:33.173 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:33.173 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:33.173 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:33.173 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:33.173 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:33.173 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:33.173 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:33.173 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:33.173 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:33.173 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41492036 kB' 'MemAvailable: 45177192 kB' 'Buffers: 8940 kB' 'Cached: 12562716 kB' 'SwapCached: 0 kB' 'Active: 9626448 kB' 'Inactive: 3663076 kB' 'Active(anon): 9221876 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 721216 kB' 'Mapped: 142052 kB' 'Shmem: 8504008 kB' 'KReclaimable: 228748 kB' 'Slab: 842544 kB' 'SReclaimable: 228748 kB' 'SUnreclaim: 613796 kB' 'KernelStack: 21712 kB' 'PageTables: 7500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10944676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
[xtrace condensed: field-by-field scan of the /proc/meminfo dump above until AnonHugePages matches]
00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
local get=HugePages_Surp 00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.175 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41492772 kB' 'MemAvailable: 45177928 kB' 'Buffers: 8940 kB' 'Cached: 12562736 kB' 'SwapCached: 0 kB' 'Active: 9625824 kB' 'Inactive: 3663076 kB' 'Active(anon): 9221252 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 720544 kB' 'Mapped: 142048 kB' 'Shmem: 8504028 kB' 'KReclaimable: 228748 kB' 'Slab: 842460 kB' 'SReclaimable: 228748 kB' 'SUnreclaim: 613712 kB' 'KernelStack: 21680 kB' 'PageTables: 7380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10944832 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.176 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.177 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.178 12:47:36 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 
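The loop traced above is the get_meminfo helper from setup/common.sh: it snapshots /proc/meminfo (or a per-NUMA-node meminfo file when a node is given), then reads the snapshot line by line with IFS=': ' until the requested key matches; the backslash-escaped pattern in the xtrace output is just bash rendering the quoted value of $get as a literal match. A minimal sketch of the technique, reconstructed from the trace rather than copied from the SPDK source (the node-file conditional and exact quoting are assumptions):

#!/usr/bin/env bash
shopt -s extglob  # needed for the +([0-9]) pattern below

# Reconstruction of get_meminfo from the xtrace output above; not the
# literal setup/common.sh source. Usage: get_meminfo <Key> [numa-node]
get_meminfo() {
	local get=$1
	local node=${2:-}
	local var val
	local mem_f mem
	mem_f=/proc/meminfo
	# Use the per-NUMA-node meminfo file when a node was requested and
	# the file exists (ordering of the two tests is an assumption).
	if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	mapfile -t mem <"$mem_f"
	# Per-node files prefix every line with "Node N "; strip that prefix.
	mem=("${mem[@]#Node +([0-9]) }")
	while IFS=': ' read -r var val _; do
		# Quoting $get makes the comparison literal, which xtrace
		# renders as the backslash-escaped pattern seen in the log.
		[[ $var == "$get" ]] && echo "$val" && return 0
	done < <(printf '%s\n' "${mem[@]}")
	return 1
}

get_meminfo HugePages_Surp  # prints 0 on the machine traced above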
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41492520 kB' 'MemAvailable: 45177676 kB' 'Buffers: 8940 kB' 'Cached: 12562752 kB' 'SwapCached: 0 kB' 'Active: 9626644 kB' 'Inactive: 3663076 kB' 'Active(anon): 9222072 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 721460 kB' 'Mapped: 142048 kB' 'Shmem: 8504044 kB' 'KReclaimable: 228748 kB' 'Slab: 842460 kB' 'SReclaimable: 228748 kB' 'SUnreclaim: 613712 kB' 'KernelStack: 21744 kB' 'PageTables: 7652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10945224 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
00:05:33.178 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[trace trimmed: each key from MemTotal through HugePages_Free is tested against HugePages_Rsvd and skipped via continue]
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:33.181 nr_hugepages=1024
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:33.181 resv_hugepages=0
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:33.181 surplus_hugepages=0
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:33.181 anon_hugepages=0
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
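After the lookups, hugepages.sh checks that the pool is fully accounted for: the expected page count (1024 in this run) must equal the allocated pages plus the surplus and reserved counts, and must also equal the allocated count on its own. A standalone sketch of that accounting, with our own helper and variable names (the literal shell around hugepages.sh lines 96-109 is not shown in the trace, so the exact expressions are assumptions):

#!/usr/bin/env bash
# Sketch of the no_shrink_alloc accounting traced above. meminfo_val is
# our own awk stand-in for get_meminfo so the snippet runs on its own;
# "expected" stands in for the pool size the test configured earlier.
expected=1024

meminfo_val() { awk -v k="$1:" '$1 == k {print $2; exit}' /proc/meminfo; }

anon=$(meminfo_val AnonHugePages)       # THP usage in kB; 0 in this run
surp=$(meminfo_val HugePages_Surp)      # surplus pages beyond the pool
resv=$(meminfo_val HugePages_Rsvd)      # reserved, not yet faulted in
nr_hugepages=$(meminfo_val HugePages_Total)

echo "nr_hugepages=$nr_hugepages"
echo "resv_hugepages=$resv"
echo "surplus_hugepages=$surp"
echo "anon_hugepages=$anon"

# Mirrors the traced checks "(( 1024 == nr_hugepages + surp + resv ))"
# and "(( 1024 == nr_hugepages ))": every expected page is accounted
# for and the pool was not shrunk.
(( expected == nr_hugepages + surp + resv )) || exit 1
(( expected == nr_hugepages )) || exit 1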
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:33.181 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41492520 kB' 'MemAvailable: 45177676 kB' 'Buffers: 8940 kB' 'Cached: 12562776 kB' 'SwapCached: 0 kB' 'Active: 9626576 kB' 'Inactive: 3663076 kB' 'Active(anon): 9222004 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 721352 kB' 'Mapped: 142048 kB' 'Shmem: 8504068 kB' 'KReclaimable: 228748 kB' 'Slab: 842460 kB' 'SReclaimable: 228748 kB' 'SUnreclaim: 613712 kB' 'KernelStack: 21728 kB' 'PageTables: 7572 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10945248 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
[trace trimmed: the scan of keys from MemTotal onward against HugePages_Total continues past the end of this excerpt; timestamps advance to 00:05:33.445]
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.445 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:33.446 12:47:36 
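For reference, the lookup that just completed (get_meminfo HugePages_Total 0, answered with 1024) boils down to scanning a meminfo file with IFS=': ' and printing the value of the first matching field. A minimal bash sketch of that behavior, assuming only what the trace shows; the name get_meminfo_sketch and the sed-based prefix strip are ours, not the verbatim setup/common.sh:

    # Scan a meminfo file and print the value of the first matching field.
    # Per-node files prefix each line with "Node <n> "; the traced script
    # strips that with mapfile + "${mem[@]#Node +([0-9]) }", sed does the
    # same job here.
    get_meminfo_sketch() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local var val _
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"    # e.g. 1024 for HugePages_Total on node0 above
                return 0
            fi
        done < <(sed 's/^Node [0-9]* //' "$mem_f")
        return 1
    }

On the node0 state shown in this run, get_meminfo_sketch HugePages_Total 0 would print 1024 and get_meminfo_sketch HugePages_Surp 0 would print 0.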
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:33.446 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25772140 kB' 'MemUsed: 6813228 kB' 'SwapCached: 0 kB' 'Active: 3144492 kB' 'Inactive: 143272 kB' 'Active(anon): 2968920 kB' 'Inactive(anon): 0 kB' 'Active(file): 175572 kB' 'Inactive(file): 143272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3106808 kB' 'Mapped: 47108 kB' 'AnonPages: 184208 kB' 'Shmem: 2787964 kB' 'KernelStack: 11560 kB' 'PageTables: 3012 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 89672 kB' 'Slab: 379724 kB' 'SReclaimable: 89672 kB' 'SUnreclaim: 290052 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:05:33.446-00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # (trace condensed: each node0 field, MemTotal through HugePages_Free, compared against HugePages_Surp; none match, continue)
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
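The node0=1024 expecting 1024 line is the per-node reconciliation step: an expected count is seeded for each node, that node's HugePages_Surp is added, and the result is compared with what sysfs reports. A hedged sketch of that bookkeeping, reusing get_meminfo_sketch from above; the array names mirror the trace, while the expected layout ([0]=1024 [1]=0) is simply what this run shows:

    # Reconcile expected vs. actual hugepage counts per NUMA node.
    shopt -s extglob nullglob
    declare -A nodes_sys=()                 # actual, read back from sysfs
    declare -A nodes_test=([0]=1024 [1]=0)  # expected layout for this run
    for node in /sys/devices/system/node/node+([0-9]); do
        n=${node##*node}
        nodes_sys[$n]=$(get_meminfo_sketch HugePages_Total "$n")
    done
    for n in "${!nodes_test[@]}"; do
        # Surplus pages on a node count toward its expected total, as traced.
        (( nodes_test[$n] += $(get_meminfo_sketch HugePages_Surp "$n") ))
        echo "node$n=${nodes_test[$n]} expecting ${nodes_sys[$n]:-0}"
    done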
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:33.448 12:47:36 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:36.748 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:36.748 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:36.748 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:36.748 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:36.748 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:36.749 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:36.749 INFO: Requested 512 hugepages but 1024 already allocated on node0
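The setup.sh run above was given NRHUGE=512 and HUGENODE=0, and its INFO line shows the no-shrink rule this test exercises: an existing 1024-page pool is left alone when only 512 pages are requested. A minimal sketch of that rule under stated assumptions; request_hugepages is a hypothetical helper name, while the sysfs path is the standard kernel layout for 2048 kB pages:

    # Never shrink an existing per-node hugepage pool; only grow it.
    request_hugepages() {
        local node=$1 want=$2
        local f=/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages
        local have
        have=$(<"$f")
        if (( have >= want )); then
            echo "INFO: Requested $want hugepages but $have already allocated on node$node"
        else
            echo "$want" > "$f"    # writing typically requires root
        fi
    }

Called as request_hugepages 0 512 against the state above, this would print exactly the INFO line seen in the log.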
setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41493424 kB' 'MemAvailable: 45178596 kB' 'Buffers: 8940 kB' 'Cached: 12563068 kB' 'SwapCached: 0 kB' 'Active: 9633424 kB' 'Inactive: 3663076 kB' 'Active(anon): 9228852 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 728648 kB' 'Mapped: 142028 kB' 'Shmem: 8504360 kB' 'KReclaimable: 228780 kB' 'Slab: 842044 kB' 'SReclaimable: 228780 kB' 'SUnreclaim: 613264 kB' 'KernelStack: 21984 kB' 'PageTables: 8088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10948912 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214512 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.749 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:36.750 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41498320 kB' 'MemAvailable: 45183548 kB' 'Buffers: 8940 kB' 'Cached: 12563068 kB' 'SwapCached: 0 kB' 'Active: 9634284 kB' 'Inactive: 3663076 kB' 'Active(anon): 9229712 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 729000 kB' 'Mapped: 142036 kB' 'Shmem: 8504360 kB' 'KReclaimable: 228892 kB' 'Slab: 842244 kB' 'SReclaimable: 228892 kB' 'SUnreclaim: 613352 kB' 'KernelStack: 21792 kB' 'PageTables: 7668 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10948932 kB' 'VmallocTotal: 34359738367 kB' 
'VmallocUsed: 214432 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB' 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.015 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.015 12:47:40 
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:37.017 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41499816 kB' 'MemAvailable: 45185044 kB' 'Buffers: 8940 kB' 'Cached: 12563092 kB' 'SwapCached: 0 kB' 'Active: 9634248 kB' 'Inactive: 3663076 kB' 'Active(anon): 9229676 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 728904 kB' 'Mapped: 142020 kB' 'Shmem: 8504384 kB' 'KReclaimable: 228892 kB' 'Slab: 842248 kB' 'SReclaimable: 228892 kB' 'SUnreclaim: 613356 kB' 'KernelStack: 21648 kB' 'PageTables: 7352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10947332 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
[... xtrace elided: the read loop compares each key against HugePages_Rsvd and continues past every non-match ...]
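One detail worth calling out from the trace: the mem=("${mem[@]#Node +([0-9]) }") step is an extglob prefix strip. Per-NUMA-node meminfo files prefix every line with "Node N ", and removing it lets one parser handle both the global and per-node files. A standalone demonstration with made-up sample lines:

#!/usr/bin/env bash
shopt -s extglob  # +([0-9]) means "one or more digits"

# Made-up per-node meminfo lines, as sysfs reports them.
mem=('Node 0 MemTotal: 30141886 kB' 'Node 0 HugePages_Total: 512')
# Strip the shortest leading match of "Node <digits> " from every element.
mem=("${mem[@]#Node +([0-9]) }")
printf '%s\n' "${mem[@]}"
# Output:
#   MemTotal: 30141886 kB
#   HugePages_Total: 512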
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:37.019 nr_hugepages=1024
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:37.019 resv_hugepages=0
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:37.019 surplus_hugepages=0
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:37.019 anon_hugepages=0
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
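The @106 and @108 arithmetic checks just above assert that the scraped numbers are mutually consistent: with surp=0 and resv=0, the count on the left (the bare 1024, which I read as a HugePages_Free value fetched earlier, outside this window, an assumption) must equal nr_hugepages + surp + resv, and must equal nr_hugepages itself, meaning no huge page leaked during the test. A hedged restatement, reusing the get_meminfo_sketch helper defined earlier:

# Values the trace derived above (names mirror the xtrace).
nr_hugepages=1024
surp=0   # HugePages_Surp
resv=0   # HugePages_Rsvd
anon=0   # AnonHugePages

# Assumption: the bare 1024 in the @106/@108 checks is HugePages_Free.
free=$(get_meminfo_sketch HugePages_Free)
(( free == nr_hugepages + surp + resv )) || echo 'hugepages leaked' >&2
(( free == nr_hugepages ))               || echo 'unexpected free count' >&2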
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:37.019 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 41499648 kB' 'MemAvailable: 45184876 kB' 'Buffers: 8940 kB' 'Cached: 12563112 kB' 'SwapCached: 0 kB' 'Active: 9634936 kB' 'Inactive: 3663076 kB' 'Active(anon): 9230364 kB' 'Inactive(anon): 0 kB' 'Active(file): 404572 kB' 'Inactive(file): 3663076 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 729600 kB' 'Mapped: 142040 kB' 'Shmem: 8504404 kB' 'KReclaimable: 228892 kB' 'Slab: 842248 kB' 'SReclaimable: 228892 kB' 'SUnreclaim: 613356 kB' 'KernelStack: 21808 kB' 'PageTables: 7656 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10948976 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 72576 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 531828 kB' 'DirectMap2M: 13834240 kB' 'DirectMap1G: 55574528 kB'
[... xtrace elided: the read loop compares each key against HugePages_Total and continues past every non-match ...]
00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25768120 kB' 'MemUsed: 6817248 kB' 'SwapCached: 0 kB' 'Active: 3147736 kB' 'Inactive: 143272 kB' 'Active(anon): 2972164 kB' 'Inactive(anon): 0 kB' 'Active(file): 175572 kB' 'Inactive(file): 143272 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3107028 kB' 'Mapped: 47132 kB' 'AnonPages: 187304 kB' 'Shmem: 2788184 kB' 'KernelStack: 11816 kB' 'PageTables: 3552 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 89704 kB' 'Slab: 379424 kB' 'SReclaimable: 89704 kB' 'SUnreclaim: 289720 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.021 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:37.022 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:37.023 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:37.023 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:37.023 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:37.023 node0=1024 expecting 1024 00:05:37.023 12:47:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:37.023 00:05:37.023 real 0m7.402s 00:05:37.023 user 0m2.734s 00:05:37.023 sys 0m4.786s 00:05:37.023 12:47:40 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:37.023 12:47:40 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:37.023 ************************************ 00:05:37.023 END TEST no_shrink_alloc 00:05:37.023 ************************************ 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:37.023 12:47:40 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:37.023 00:05:37.023 real 0m24.795s 00:05:37.023 user 0m8.744s 00:05:37.023 sys 0m14.952s 
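The key scans condensed above are setup/common.sh's get_meminfo helper at work: it picks /proc/meminfo or the per-node /sys/devices/system/node/nodeN/meminfo, strips the "Node N " prefix the per-node files carry, then walks the lines with an IFS=': ' read until the requested key matches. A minimal standalone sketch of that pattern, reconstructed from the xtrace rather than copied from the SPDK source (the function body and error handling here are assumptions):

    shopt -s extglob                     # needed for the +([0-9]) prefix strip below

    get_meminfo() {                      # usage: get_meminfo <Key> [node]
        local get=$1 node=$2 var val _
        local mem_f=/proc/meminfo mem line
        # prefer the per-node meminfo when a node is named and present
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix every line with "Node N "
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }

    # Against the node0 dump printed above, this would echo 0:
    get_meminfo HugePages_Surp 0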
00:05:37.023 12:47:40 setup.sh.hugepages -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:37.023 12:47:40 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:37.023 ************************************ 00:05:37.023 END TEST hugepages 00:05:37.023 ************************************ 00:05:37.023 12:47:40 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:37.023 12:47:40 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:37.023 12:47:40 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:37.023 12:47:40 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:37.284 ************************************ 00:05:37.284 START TEST driver 00:05:37.284 ************************************ 00:05:37.284 12:47:40 setup.sh.driver -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:37.284 * Looking for test storage... 00:05:37.284 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:37.284 12:47:40 setup.sh.driver -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:37.284 12:47:40 setup.sh.driver -- common/autotest_common.sh@1711 -- # lcov --version 00:05:37.284 12:47:40 setup.sh.driver -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:37.284 12:47:40 setup.sh.driver -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:37.284 12:47:40 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:05:37.284 12:47:40 setup.sh.driver -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:37.284 12:47:40 setup.sh.driver -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:37.284 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.284 --rc genhtml_branch_coverage=1 00:05:37.284 --rc genhtml_function_coverage=1 00:05:37.284 --rc genhtml_legend=1 00:05:37.284 --rc geninfo_all_blocks=1 00:05:37.284 --rc geninfo_unexecuted_blocks=1 00:05:37.284 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.284 ' 00:05:37.284 12:47:40 setup.sh.driver -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:37.284 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.284 --rc genhtml_branch_coverage=1 00:05:37.284 --rc genhtml_function_coverage=1 00:05:37.284 --rc genhtml_legend=1 00:05:37.284 --rc geninfo_all_blocks=1 00:05:37.284 --rc geninfo_unexecuted_blocks=1 00:05:37.284 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.284 ' 00:05:37.284 12:47:40 setup.sh.driver -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:05:37.284 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.284 --rc genhtml_branch_coverage=1 00:05:37.284 --rc genhtml_function_coverage=1 00:05:37.284 --rc genhtml_legend=1 00:05:37.284 --rc geninfo_all_blocks=1 00:05:37.284 --rc geninfo_unexecuted_blocks=1 00:05:37.284 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.284 ' 00:05:37.284 12:47:40 setup.sh.driver -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:37.284 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.284 --rc genhtml_branch_coverage=1 00:05:37.284 --rc genhtml_function_coverage=1 00:05:37.284 --rc genhtml_legend=1 00:05:37.284 --rc geninfo_all_blocks=1 00:05:37.284 --rc geninfo_unexecuted_blocks=1 00:05:37.284 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.284 ' 00:05:37.284 12:47:40 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:37.284 12:47:40 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:37.284 12:47:40 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:42.571 12:47:45 setup.sh.driver -- 
setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:42.571 12:47:45 setup.sh.driver -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:42.571 12:47:45 setup.sh.driver -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:42.571 12:47:45 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:42.571 ************************************ 00:05:42.571 START TEST guess_driver 00:05:42.571 ************************************ 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- common/autotest_common.sh@1129 -- # guess_driver 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:42.571 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:42.571 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:42.571 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:42.571 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:42.571 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:42.571 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:42.571 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:42.571 Looking for driver=vfio-pci 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:42.571 12:47:45 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- 
# setup output config
12:47:45 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]]
12:47:45 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:05:45.865 12:47:49 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]]
00:05:45.865 12:47:49 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]]
00:05:45.865 12:47:49 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
[xtrace condensed: setup/driver.sh@57-61 repeated the marker read and the vfio-pci comparison for each remaining line of the config output (00:05:45.866 through 00:05:47.769), every line matching vfio-pci]
00:05:47.769 12:47:50 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 ))
00:05:47.769 12:47:50 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset
00:05:47.769 12:47:50 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]]
00:05:47.769 12:47:50 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:05:53.058
00:05:53.058 real 0m10.233s
00:05:53.058 user 0m2.668s
00:05:53.058 sys 0m5.232s
00:05:53.058 12:47:55 setup.sh.driver.guess_driver -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:53.058 12:47:55 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x
00:05:53.059 ************************************
00:05:53.059 END TEST guess_driver
00:05:53.059 ************************************
00:05:53.059
00:05:53.059 real 0m15.517s
00:05:53.059 user 0m4.168s
00:05:53.059 sys 0m8.190s
00:05:53.059 12:47:55 setup.sh.driver -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:53.059 12:47:55 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x
00:05:53.059 ************************************
00:05:53.059 END TEST driver
00:05:53.059 ************************************
00:05:53.059 12:47:55 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
00:05:53.059 12:47:55 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:53.059 12:47:55 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:53.059 12:47:55 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:05:53.059 ************************************
00:05:53.059 START TEST devices
00:05:53.059 ************************************
00:05:53.059 12:47:55 setup.sh.devices -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
00:05:53.059 * Looking for test storage...
00:05:53.059 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
[duplicate trace removed: the same lcov version probe and LCOV_OPTS/LCOV export sequence already shown under TEST driver above (common/autotest_common.sh@1710-1725, scripts/common.sh@333-368) ran again here with the setup.sh.devices prefix, 00:05:53.059 12:47:56]
00:05:53.059 12:47:56 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT
00:05:53.059 12:47:56 setup.sh.devices -- setup/devices.sh@192 -- # setup reset
00:05:53.059 12:47:56 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]]
00:05:53.059 12:47:56 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1658 -- # zoned_ctrls=() 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1658 -- # local -A zoned_ctrls 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1659 -- # local nvme bdf ns 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1669 -- # bdf=0000:d8:00.0 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1671 -- # is_block_zoned nvme0n1 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:57.262 12:47:59 setup.sh.devices -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:57.262 12:47:59 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:57.262 12:47:59 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:05:57.262 12:47:59 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:57.262 No valid GPT data, bailing 00:05:57.262 12:48:00 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:57.262 12:48:00 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:05:57.262 12:48:00 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:05:57.262 12:48:00 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:57.262 12:48:00 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:57.262 12:48:00 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:57.262 12:48:00 setup.sh.devices -- setup/common.sh@80 -- # echo 1600321314816 00:05:57.263 12:48:00 setup.sh.devices -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:57.263 12:48:00 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:57.263 12:48:00 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 
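The records above are the eligibility gate the devices suite applies before it will partition a disk: spdk-gpt.py finds no valid GPT, blkid prints no PTTYPE (so block_in_use returns 1, i.e. the disk is free), and sec_size_to_bytes reports 1600321314816 bytes, comfortably over the min_disk_size floor of 3221225472 set at devices.sh@198. A compact sketch of that check, with the caveats that the helper name is invented for illustration, it probes with blkid only (the real scripts try spdk-gpt.py first), and it assumes the usual 512-byte sector count in /sys/block:

    # Hypothetical helper: succeeds only for an unclaimed disk of >= min_disk_size.
    is_usable_test_disk() {
        local block=$1
        local min_disk_size=3221225472      # same floor as devices.sh@198
        # any partition-table signature means the disk is already in use
        local pt
        pt=$(blkid -s PTTYPE -o value "/dev/$block" 2>/dev/null)
        [[ -n $pt ]] && return 1
        # /sys/block/<dev>/size counts 512-byte sectors
        local sectors
        sectors=$(< "/sys/block/$block/size")
        (( sectors * 512 >= min_disk_size ))
    }

    # For the nvme0n1 above: no PTTYPE, 1600321314816 bytes -> returns 0 (usable).
    is_usable_test_disk nvme0n1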
00:05:57.263 12:48:00 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:57.263 12:48:00 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:57.263 12:48:00 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:57.263 12:48:00 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:57.263 12:48:00 setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:57.263 12:48:00 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:57.263 ************************************ 00:05:57.263 START TEST nvme_mount 00:05:57.263 ************************************ 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1129 -- # nvme_mount 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:57.263 12:48:00 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:57.833 Creating new GPT entries in memory. 00:05:57.833 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:57.833 other utilities. 00:05:57.833 12:48:01 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:57.833 12:48:01 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:57.833 12:48:01 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 
2048 : part_end + 1 )) 00:05:57.833 12:48:01 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:57.833 12:48:01 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:59.215 Creating new GPT entries in memory. 00:05:59.215 The operation has completed successfully. 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 117945 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:59.215 12:48:02 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.509 12:48:05 
setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.509 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 
0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:02.510 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:02.510 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:02.768 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:02.768 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:06:02.768 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:02.768 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:02.768 12:48:05 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:06:02.768 12:48:05 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:06:02.768 12:48:05 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.768 12:48:05 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:06:02.768 12:48:05 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:06:02.768 12:48:05 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:02.768 12:48:06 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:06.059 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.059 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.059 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.059 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:06.060 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 
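Each of these device walks is the same verify() pattern: read the setup.sh config report line by line, and set found=1 when the allowed controller's status line names the expected active device. A condensed sketch, assuming an illustrative invocation path:

    PCI_ALLOWED=0000:d8:00.0
    mounts=data@nvme0n1
    found=0
    while read -r pci _ _ status; do
        [[ $pci == "$PCI_ALLOWED" ]] || continue
        [[ $status == *"Active devices: "*"$mounts"* ]] && found=1
    done < <(PCI_ALLOWED=$PCI_ALLOWED ./scripts/setup.sh config)   # illustrative path
    (( found == 1 ))

All the non-matching 0000:00:04.x and 0000:80:04.x I/OAT entries fall through the first test; only 0000:d8:00.0 reaches the status match.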
00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:06.320 12:48:09 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:09.632 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:09.632 00:06:09.632 real 0m12.802s 00:06:09.632 user 0m3.785s 00:06:09.632 sys 0m6.968s 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:09.632 12:48:12 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:06:09.632 ************************************ 00:06:09.632 END TEST nvme_mount 00:06:09.632 ************************************ 00:06:09.632 12:48:12 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:09.632 12:48:12 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:09.632 12:48:12 setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:09.632 12:48:12 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:09.632 ************************************ 00:06:09.632 START TEST dm_mount 00:06:09.632 ************************************ 00:06:09.632 12:48:12 setup.sh.devices.dm_mount -- common/autotest_common.sh@1129 -- # 
dm_mount 00:06:09.632 12:48:12 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:09.894 12:48:12 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:10.836 Creating new GPT entries in memory. 00:06:10.836 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:10.836 other utilities. 00:06:10.836 12:48:13 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:10.836 12:48:13 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:10.836 12:48:13 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:10.836 12:48:13 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:10.836 12:48:13 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:11.776 Creating new GPT entries in memory. 00:06:11.776 The operation has completed successfully. 00:06:11.776 12:48:14 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:11.776 12:48:14 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:11.776 12:48:14 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:11.776 12:48:14 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:11.776 12:48:14 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:12.716 The operation has completed successfully. 
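The two sgdisk calls above follow the partition arithmetic from setup/common.sh: the first 1 GiB partition starts at sector 2048, and each subsequent one starts one sector past the previous end, all under a flock on the disk. A runnable sketch of that loop:

    disk=/dev/nvme0n1
    size=$(( 1073741824 / 512 ))        # one 1 GiB partition, in 512 B sectors
    part_start=0 part_end=0
    for part in 1 2; do                 # part_no=2 for the dm_mount test
        (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
        (( part_end = part_start + size - 1 ))
        flock "$disk" sgdisk "$disk" --new=$part:$part_start:$part_end
    done                                # yields 1:2048:2099199 and 2:2099200:4196351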
00:06:12.716 12:48:16 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:12.716 12:48:16 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:12.716 12:48:16 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 123039 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:12.977 12:48:16 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:16.299 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:16.300 12:48:19 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:16.300 12:48:19 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:19.597 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.597 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.597 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.597 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.597 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:19.598 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:19.858 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:19.858 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:19.858 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:19.858 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:19.858 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:19.858 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:19.858 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:19.858 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:19.858 12:48:22 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:19.858 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:19.858 12:48:23 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:19.858 12:48:23 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:19.858 00:06:19.858 real 0m10.074s 00:06:19.858 user 0m2.537s 00:06:19.858 sys 0m4.633s 00:06:19.858 12:48:23 setup.sh.devices.dm_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:19.858 12:48:23 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:19.858 ************************************ 00:06:19.858 END TEST dm_mount 00:06:19.858 ************************************ 00:06:19.858 12:48:23 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:19.858 12:48:23 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:19.858 12:48:23 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:19.858 12:48:23 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:19.858 12:48:23 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:19.858 12:48:23 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:19.858 12:48:23 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:20.119 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:20.119 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:06:20.119 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:20.119 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:20.119 12:48:23 setup.sh.devices -- 
setup/devices.sh@12 -- # cleanup_dm 00:06:20.119 12:48:23 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:20.119 12:48:23 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:20.119 12:48:23 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:20.119 12:48:23 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:20.119 12:48:23 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:20.119 12:48:23 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:20.119 00:06:20.119 real 0m27.400s 00:06:20.119 user 0m7.914s 00:06:20.119 sys 0m14.463s 00:06:20.119 12:48:23 setup.sh.devices -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:20.119 12:48:23 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:20.119 ************************************ 00:06:20.119 END TEST devices 00:06:20.119 ************************************ 00:06:20.119 00:06:20.119 real 1m33.748s 00:06:20.119 user 0m29.245s 00:06:20.119 sys 0m53.495s 00:06:20.119 12:48:23 setup.sh -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:20.119 12:48:23 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:20.119 ************************************ 00:06:20.119 END TEST setup.sh 00:06:20.119 ************************************ 00:06:20.379 12:48:23 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:23.676 Hugepages 00:06:23.676 node hugesize free / total 00:06:23.676 node0 1048576kB 0 / 0 00:06:23.676 node0 2048kB 1024 / 1024 00:06:23.676 node1 1048576kB 0 / 0 00:06:23.676 node1 2048kB 1024 / 1024 00:06:23.676 00:06:23.676 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:23.676 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:23.676 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:23.676 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:23.676 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:23.676 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:23.676 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:23.676 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:23.676 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:23.676 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:23.676 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:23.676 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:23.676 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:23.676 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:23.676 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:23.676 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:23.676 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:23.676 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:06:23.676 12:48:26 -- spdk/autotest.sh@117 -- # uname -s 00:06:23.676 12:48:26 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:06:23.676 12:48:26 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:06:23.676 12:48:26 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:26.974 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:26.974 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:26.974 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:26.974 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:27.234 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:27.234 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:27.234 0000:00:04.1 (8086 2021): ioatdma 
-> vfio-pci 00:06:27.234 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:27.234 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:27.234 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:27.234 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:27.234 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:27.234 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:27.234 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:27.234 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:27.234 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:28.616 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:28.876 12:48:32 -- common/autotest_common.sh@1517 -- # sleep 1 00:06:29.815 12:48:33 -- common/autotest_common.sh@1518 -- # bdfs=() 00:06:29.815 12:48:33 -- common/autotest_common.sh@1518 -- # local bdfs 00:06:29.815 12:48:33 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:06:29.815 12:48:33 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:06:29.815 12:48:33 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:29.815 12:48:33 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:29.815 12:48:33 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:29.815 12:48:33 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:29.815 12:48:33 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:30.075 12:48:33 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:30.075 12:48:33 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:30.075 12:48:33 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:33.372 Waiting for block devices as requested 00:06:33.372 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:33.372 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:33.633 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:33.633 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:33.633 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:33.893 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:33.893 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:33.893 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:34.154 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:34.154 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:34.154 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:34.415 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:34.415 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:34.415 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:34.676 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:34.676 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:34.676 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:06:34.937 12:48:38 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:06:34.937 12:48:38 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:06:34.937 12:48:38 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:06:34.937 12:48:38 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:06:34.937 12:48:38 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:34.937 12:48:38 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:06:34.937 12:48:38 -- common/autotest_common.sh@1492 -- # basename 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:34.937 12:48:38 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:06:34.937 12:48:38 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:06:34.937 12:48:38 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:06:34.937 12:48:38 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:06:34.937 12:48:38 -- common/autotest_common.sh@1531 -- # grep oacs 00:06:34.937 12:48:38 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:06:34.937 12:48:38 -- common/autotest_common.sh@1531 -- # oacs=' 0xe' 00:06:34.937 12:48:38 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:06:34.937 12:48:38 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:06:34.937 12:48:38 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:06:34.937 12:48:38 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:06:34.937 12:48:38 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:06:34.937 12:48:38 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:06:34.937 12:48:38 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:06:34.937 12:48:38 -- common/autotest_common.sh@1543 -- # continue 00:06:34.937 12:48:38 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:06:34.937 12:48:38 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:34.937 12:48:38 -- common/autotest_common.sh@10 -- # set +x 00:06:34.937 12:48:38 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:06:34.937 12:48:38 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:34.937 12:48:38 -- common/autotest_common.sh@10 -- # set +x 00:06:34.937 12:48:38 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:39.172 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:39.172 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:40.115 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:40.376 12:48:43 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:06:40.376 12:48:43 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:40.376 12:48:43 -- common/autotest_common.sh@10 -- # set +x 00:06:40.376 12:48:43 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:06:40.376 12:48:43 -- common/autotest_common.sh@1578 -- # mapfile -t bdfs 00:06:40.376 12:48:43 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs_by_id 0x0a54 00:06:40.376 12:48:43 -- common/autotest_common.sh@1563 -- # bdfs=() 00:06:40.376 12:48:43 -- common/autotest_common.sh@1563 -- # _bdfs=() 00:06:40.376 12:48:43 -- common/autotest_common.sh@1563 -- # local bdfs _bdfs 00:06:40.376 12:48:43 -- common/autotest_common.sh@1564 -- # _bdfs=($(get_nvme_bdfs)) 00:06:40.376 12:48:43 -- 
common/autotest_common.sh@1564 -- # get_nvme_bdfs 00:06:40.376 12:48:43 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:40.376 12:48:43 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:40.376 12:48:43 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:40.376 12:48:43 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:40.376 12:48:43 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:40.376 12:48:43 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:40.376 12:48:43 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:40.376 12:48:43 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}" 00:06:40.376 12:48:43 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:06:40.376 12:48:43 -- common/autotest_common.sh@1566 -- # device=0x0a54 00:06:40.376 12:48:43 -- common/autotest_common.sh@1567 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:40.376 12:48:43 -- common/autotest_common.sh@1568 -- # bdfs+=($bdf) 00:06:40.376 12:48:43 -- common/autotest_common.sh@1572 -- # (( 1 > 0 )) 00:06:40.377 12:48:43 -- common/autotest_common.sh@1573 -- # printf '%s\n' 0000:d8:00.0 00:06:40.377 12:48:43 -- common/autotest_common.sh@1579 -- # [[ -z 0000:d8:00.0 ]] 00:06:40.638 12:48:43 -- common/autotest_common.sh@1584 -- # spdk_tgt_pid=132972 00:06:40.638 12:48:43 -- common/autotest_common.sh@1585 -- # waitforlisten 132972 00:06:40.638 12:48:43 -- common/autotest_common.sh@1583 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:40.638 12:48:43 -- common/autotest_common.sh@835 -- # '[' -z 132972 ']' 00:06:40.638 12:48:43 -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:40.638 12:48:43 -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:40.638 12:48:43 -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:40.638 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:40.638 12:48:43 -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:40.638 12:48:43 -- common/autotest_common.sh@10 -- # set +x 00:06:40.638 [2024-12-05 12:48:43.725538] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
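The BDF discovery traced just before the target launch reduces to: gen_nvme.sh emits a bdev JSON config, jq pulls out the PCI addresses, and each address is kept only if its PCI device id is 0x0a54. Condensed from the trace:

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    _bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
    bdfs=()
    for bdf in "${_bdfs[@]}"; do
        device=$(cat "/sys/bus/pci/devices/$bdf/device")
        [[ $device == 0x0a54 ]] && bdfs+=("$bdf")
    done
    printf '%s\n' "${bdfs[@]}"          # here: 0000:d8:00.0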
00:06:40.638 [2024-12-05 12:48:43.725588] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid132972 ] 00:06:40.638 [2024-12-05 12:48:43.810255] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.638 [2024-12-05 12:48:43.832992] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.899 12:48:44 -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:40.899 12:48:44 -- common/autotest_common.sh@868 -- # return 0 00:06:40.899 12:48:44 -- common/autotest_common.sh@1587 -- # bdf_id=0 00:06:40.899 12:48:44 -- common/autotest_common.sh@1588 -- # for bdf in "${bdfs[@]}" 00:06:40.899 12:48:44 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:06:44.193 nvme0n1 00:06:44.193 12:48:47 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:44.193 [2024-12-05 12:48:47.239215] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:44.193 request: 00:06:44.193 { 00:06:44.193 "nvme_ctrlr_name": "nvme0", 00:06:44.193 "password": "test", 00:06:44.193 "method": "bdev_nvme_opal_revert", 00:06:44.193 "req_id": 1 00:06:44.193 } 00:06:44.193 Got JSON-RPC error response 00:06:44.193 response: 00:06:44.193 { 00:06:44.193 "code": -32602, 00:06:44.193 "message": "Invalid parameters" 00:06:44.193 } 00:06:44.193 12:48:47 -- common/autotest_common.sh@1591 -- # true 00:06:44.193 12:48:47 -- common/autotest_common.sh@1592 -- # (( ++bdf_id )) 00:06:44.193 12:48:47 -- common/autotest_common.sh@1595 -- # killprocess 132972 00:06:44.194 12:48:47 -- common/autotest_common.sh@954 -- # '[' -z 132972 ']' 00:06:44.194 12:48:47 -- common/autotest_common.sh@958 -- # kill -0 132972 00:06:44.194 12:48:47 -- common/autotest_common.sh@959 -- # uname 00:06:44.194 12:48:47 -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:44.194 12:48:47 -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 132972 00:06:44.194 12:48:47 -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:44.194 12:48:47 -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:44.194 12:48:47 -- common/autotest_common.sh@972 -- # echo 'killing process with pid 132972' 00:06:44.194 killing process with pid 132972 00:06:44.194 12:48:47 -- common/autotest_common.sh@973 -- # kill 132972 00:06:44.194 12:48:47 -- common/autotest_common.sh@978 -- # wait 132972 00:06:44.194 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
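The -32602 response above is the expected outcome on this drive: the controller reports no OPAL support, so bdev_nvme_opal_revert fails and the harness swallows the error with true before killing the target. Issued by hand against a running spdk_tgt, the same two RPCs look like this (arguments copied from the trace above):

    ./scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
    ./scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test \
        || echo "revert rejected: controller has no OPAL support"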
00:06:46.737 12:48:49 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:06:46.737 12:48:49 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:06:46.737 12:48:49 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:46.737 12:48:49 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:46.737 12:48:49 -- spdk/autotest.sh@149 -- # timing_enter lib 00:06:46.737 12:48:49 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:46.737 12:48:49 -- common/autotest_common.sh@10 -- # set +x 00:06:46.737 12:48:49 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:06:46.737 12:48:49 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:46.737 12:48:49 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:46.737 12:48:49 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:46.737 12:48:49 -- common/autotest_common.sh@10 -- # set +x 00:06:46.737 ************************************ 00:06:46.737 START TEST env 00:06:46.737 ************************************ 00:06:46.737 12:48:49 env -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:46.737 * Looking for test storage... 00:06:46.737 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:06:46.737 12:48:49 env -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:06:46.737 12:48:49 env -- common/autotest_common.sh@1711 -- # lcov --version 00:06:46.737 12:48:49 env -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:06:46.737 12:48:49 env -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:06:46.737 12:48:49 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:46.737 12:48:49 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:46.737 12:48:49 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:46.737 12:48:49 env -- scripts/common.sh@336 -- # IFS=.-: 00:06:46.737 12:48:49 env -- scripts/common.sh@336 -- # read -ra ver1 00:06:46.737 12:48:49 env -- scripts/common.sh@337 -- # IFS=.-: 00:06:46.737 12:48:49 env -- scripts/common.sh@337 -- # read -ra ver2 00:06:46.737 12:48:49 env -- scripts/common.sh@338 -- # local 'op=<' 00:06:46.737 12:48:49 env -- scripts/common.sh@340 -- # ver1_l=2 00:06:46.737 12:48:49 env -- scripts/common.sh@341 -- # ver2_l=1 00:06:46.737 12:48:49 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:46.737 12:48:49 env -- scripts/common.sh@344 -- # case "$op" in 00:06:46.737 12:48:49 env -- scripts/common.sh@345 -- # : 1 00:06:46.737 12:48:49 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:46.737 12:48:49 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:46.737 12:48:49 env -- scripts/common.sh@365 -- # decimal 1 00:06:46.737 12:48:49 env -- scripts/common.sh@353 -- # local d=1 00:06:46.737 12:48:49 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:46.737 12:48:49 env -- scripts/common.sh@355 -- # echo 1 00:06:46.737 12:48:49 env -- scripts/common.sh@365 -- # ver1[v]=1 00:06:46.737 12:48:49 env -- scripts/common.sh@366 -- # decimal 2 00:06:46.737 12:48:49 env -- scripts/common.sh@353 -- # local d=2 00:06:46.737 12:48:49 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:46.737 12:48:49 env -- scripts/common.sh@355 -- # echo 2 00:06:46.737 12:48:49 env -- scripts/common.sh@366 -- # ver2[v]=2 00:06:46.737 12:48:49 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:46.737 12:48:49 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:46.737 12:48:49 env -- scripts/common.sh@368 -- # return 0 00:06:46.737 12:48:49 env -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:46.737 12:48:49 env -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:06:46.737 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.737 --rc genhtml_branch_coverage=1 00:06:46.737 --rc genhtml_function_coverage=1 00:06:46.737 --rc genhtml_legend=1 00:06:46.737 --rc geninfo_all_blocks=1 00:06:46.737 --rc geninfo_unexecuted_blocks=1 00:06:46.737 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.737 ' 00:06:46.737 12:48:49 env -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:06:46.737 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.737 --rc genhtml_branch_coverage=1 00:06:46.737 --rc genhtml_function_coverage=1 00:06:46.737 --rc genhtml_legend=1 00:06:46.737 --rc geninfo_all_blocks=1 00:06:46.737 --rc geninfo_unexecuted_blocks=1 00:06:46.738 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.738 ' 00:06:46.738 12:48:49 env -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:06:46.738 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.738 --rc genhtml_branch_coverage=1 00:06:46.738 --rc genhtml_function_coverage=1 00:06:46.738 --rc genhtml_legend=1 00:06:46.738 --rc geninfo_all_blocks=1 00:06:46.738 --rc geninfo_unexecuted_blocks=1 00:06:46.738 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.738 ' 00:06:46.738 12:48:49 env -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:06:46.738 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.738 --rc genhtml_branch_coverage=1 00:06:46.738 --rc genhtml_function_coverage=1 00:06:46.738 --rc genhtml_legend=1 00:06:46.738 --rc geninfo_all_blocks=1 00:06:46.738 --rc geninfo_unexecuted_blocks=1 00:06:46.738 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.738 ' 00:06:46.738 12:48:49 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:46.738 12:48:49 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:46.738 12:48:49 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:46.738 12:48:49 env -- common/autotest_common.sh@10 -- # set +x 00:06:46.738 ************************************ 00:06:46.738 START TEST env_memory 00:06:46.738 ************************************ 00:06:46.738 12:48:49 env.env_memory -- 
common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:46.738 00:06:46.738 00:06:46.738 CUnit - A unit testing framework for C - Version 2.1-3 00:06:46.738 http://cunit.sourceforge.net/ 00:06:46.738 00:06:46.738 00:06:46.738 Suite: memory 00:06:46.738 Test: alloc and free memory map ...[2024-12-05 12:48:49.775611] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:46.738 passed 00:06:46.738 Test: mem map translation ...[2024-12-05 12:48:49.788349] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:46.738 [2024-12-05 12:48:49.788365] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:46.738 [2024-12-05 12:48:49.788397] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:46.738 [2024-12-05 12:48:49.788406] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:46.738 passed 00:06:46.738 Test: mem map registration ...[2024-12-05 12:48:49.808643] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:06:46.738 [2024-12-05 12:48:49.808659] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:06:46.738 passed 00:06:46.738 Test: mem map adjacent registrations ...passed 00:06:46.738 00:06:46.738 Run Summary: Type Total Ran Passed Failed Inactive 00:06:46.738 suites 1 1 n/a 0 0 00:06:46.738 tests 4 4 4 0 0 00:06:46.738 asserts 152 152 152 0 n/a 00:06:46.738 00:06:46.738 Elapsed time = 0.072 seconds 00:06:46.738 00:06:46.738 real 0m0.081s 00:06:46.738 user 0m0.068s 00:06:46.738 sys 0m0.012s 00:06:46.738 12:48:49 env.env_memory -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:46.738 12:48:49 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:46.738 ************************************ 00:06:46.738 END TEST env_memory 00:06:46.738 ************************************ 00:06:46.738 12:48:49 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:46.738 12:48:49 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:46.738 12:48:49 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:46.738 12:48:49 env -- common/autotest_common.sh@10 -- # set +x 00:06:46.738 ************************************ 00:06:46.738 START TEST env_vtophys 00:06:46.738 ************************************ 00:06:46.738 12:48:49 env.env_vtophys -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:46.738 EAL: lib.eal log level changed from notice to debug 00:06:46.738 EAL: Detected lcore 0 as core 0 on socket 0 00:06:46.738 EAL: Detected lcore 1 as core 1 on socket 0 00:06:46.738 EAL: Detected lcore 2 as core 2 on socket 0 00:06:46.738 EAL: Detected lcore 3 as 
core 3 on socket 0 00:06:46.738 EAL: Detected lcore 4 as core 4 on socket 0 00:06:46.738 EAL: Detected lcore 5 as core 5 on socket 0 00:06:46.738 EAL: Detected lcore 6 as core 6 on socket 0 00:06:46.738 EAL: Detected lcore 7 as core 8 on socket 0 00:06:46.738 EAL: Detected lcore 8 as core 9 on socket 0 00:06:46.738 EAL: Detected lcore 9 as core 10 on socket 0 00:06:46.738 EAL: Detected lcore 10 as core 11 on socket 0 00:06:46.738 EAL: Detected lcore 11 as core 12 on socket 0 00:06:46.738 EAL: Detected lcore 12 as core 13 on socket 0 00:06:46.738 EAL: Detected lcore 13 as core 14 on socket 0 00:06:46.738 EAL: Detected lcore 14 as core 16 on socket 0 00:06:46.738 EAL: Detected lcore 15 as core 17 on socket 0 00:06:46.738 EAL: Detected lcore 16 as core 18 on socket 0 00:06:46.738 EAL: Detected lcore 17 as core 19 on socket 0 00:06:46.738 EAL: Detected lcore 18 as core 20 on socket 0 00:06:46.738 EAL: Detected lcore 19 as core 21 on socket 0 00:06:46.738 EAL: Detected lcore 20 as core 22 on socket 0 00:06:46.738 EAL: Detected lcore 21 as core 24 on socket 0 00:06:46.738 EAL: Detected lcore 22 as core 25 on socket 0 00:06:46.738 EAL: Detected lcore 23 as core 26 on socket 0 00:06:46.738 EAL: Detected lcore 24 as core 27 on socket 0 00:06:46.738 EAL: Detected lcore 25 as core 28 on socket 0 00:06:46.738 EAL: Detected lcore 26 as core 29 on socket 0 00:06:46.738 EAL: Detected lcore 27 as core 30 on socket 0 00:06:46.738 EAL: Detected lcore 28 as core 0 on socket 1 00:06:46.738 EAL: Detected lcore 29 as core 1 on socket 1 00:06:46.738 EAL: Detected lcore 30 as core 2 on socket 1 00:06:46.738 EAL: Detected lcore 31 as core 3 on socket 1 00:06:46.738 EAL: Detected lcore 32 as core 4 on socket 1 00:06:46.738 EAL: Detected lcore 33 as core 5 on socket 1 00:06:46.738 EAL: Detected lcore 34 as core 6 on socket 1 00:06:46.738 EAL: Detected lcore 35 as core 8 on socket 1 00:06:46.738 EAL: Detected lcore 36 as core 9 on socket 1 00:06:46.738 EAL: Detected lcore 37 as core 10 on socket 1 00:06:46.738 EAL: Detected lcore 38 as core 11 on socket 1 00:06:46.738 EAL: Detected lcore 39 as core 12 on socket 1 00:06:46.738 EAL: Detected lcore 40 as core 13 on socket 1 00:06:46.738 EAL: Detected lcore 41 as core 14 on socket 1 00:06:46.738 EAL: Detected lcore 42 as core 16 on socket 1 00:06:46.738 EAL: Detected lcore 43 as core 17 on socket 1 00:06:46.738 EAL: Detected lcore 44 as core 18 on socket 1 00:06:46.738 EAL: Detected lcore 45 as core 19 on socket 1 00:06:46.738 EAL: Detected lcore 46 as core 20 on socket 1 00:06:46.738 EAL: Detected lcore 47 as core 21 on socket 1 00:06:46.738 EAL: Detected lcore 48 as core 22 on socket 1 00:06:46.738 EAL: Detected lcore 49 as core 24 on socket 1 00:06:46.738 EAL: Detected lcore 50 as core 25 on socket 1 00:06:46.738 EAL: Detected lcore 51 as core 26 on socket 1 00:06:46.738 EAL: Detected lcore 52 as core 27 on socket 1 00:06:46.738 EAL: Detected lcore 53 as core 28 on socket 1 00:06:46.739 EAL: Detected lcore 54 as core 29 on socket 1 00:06:46.739 EAL: Detected lcore 55 as core 30 on socket 1 00:06:46.739 EAL: Detected lcore 56 as core 0 on socket 0 00:06:46.739 EAL: Detected lcore 57 as core 1 on socket 0 00:06:46.739 EAL: Detected lcore 58 as core 2 on socket 0 00:06:46.739 EAL: Detected lcore 59 as core 3 on socket 0 00:06:46.739 EAL: Detected lcore 60 as core 4 on socket 0 00:06:46.739 EAL: Detected lcore 61 as core 5 on socket 0 00:06:46.739 EAL: Detected lcore 62 as core 6 on socket 0 00:06:46.739 EAL: Detected lcore 63 as core 8 on socket 0 00:06:46.739 EAL: 
Detected lcore 64 as core 9 on socket 0 00:06:46.739 EAL: Detected lcore 65 as core 10 on socket 0 00:06:46.739 EAL: Detected lcore 66 as core 11 on socket 0 00:06:46.739 EAL: Detected lcore 67 as core 12 on socket 0 00:06:46.739 EAL: Detected lcore 68 as core 13 on socket 0 00:06:46.739 EAL: Detected lcore 69 as core 14 on socket 0 00:06:46.739 EAL: Detected lcore 70 as core 16 on socket 0 00:06:46.739 EAL: Detected lcore 71 as core 17 on socket 0 00:06:46.739 EAL: Detected lcore 72 as core 18 on socket 0 00:06:46.739 EAL: Detected lcore 73 as core 19 on socket 0 00:06:46.739 EAL: Detected lcore 74 as core 20 on socket 0 00:06:46.739 EAL: Detected lcore 75 as core 21 on socket 0 00:06:46.739 EAL: Detected lcore 76 as core 22 on socket 0 00:06:46.739 EAL: Detected lcore 77 as core 24 on socket 0 00:06:46.739 EAL: Detected lcore 78 as core 25 on socket 0 00:06:46.739 EAL: Detected lcore 79 as core 26 on socket 0 00:06:46.739 EAL: Detected lcore 80 as core 27 on socket 0 00:06:46.739 EAL: Detected lcore 81 as core 28 on socket 0 00:06:46.739 EAL: Detected lcore 82 as core 29 on socket 0 00:06:46.739 EAL: Detected lcore 83 as core 30 on socket 0 00:06:46.739 EAL: Detected lcore 84 as core 0 on socket 1 00:06:46.739 EAL: Detected lcore 85 as core 1 on socket 1 00:06:46.739 EAL: Detected lcore 86 as core 2 on socket 1 00:06:46.739 EAL: Detected lcore 87 as core 3 on socket 1 00:06:46.739 EAL: Detected lcore 88 as core 4 on socket 1 00:06:46.739 EAL: Detected lcore 89 as core 5 on socket 1 00:06:46.739 EAL: Detected lcore 90 as core 6 on socket 1 00:06:46.739 EAL: Detected lcore 91 as core 8 on socket 1 00:06:46.739 EAL: Detected lcore 92 as core 9 on socket 1 00:06:46.739 EAL: Detected lcore 93 as core 10 on socket 1 00:06:46.739 EAL: Detected lcore 94 as core 11 on socket 1 00:06:46.739 EAL: Detected lcore 95 as core 12 on socket 1 00:06:46.739 EAL: Detected lcore 96 as core 13 on socket 1 00:06:46.739 EAL: Detected lcore 97 as core 14 on socket 1 00:06:46.739 EAL: Detected lcore 98 as core 16 on socket 1 00:06:46.739 EAL: Detected lcore 99 as core 17 on socket 1 00:06:46.739 EAL: Detected lcore 100 as core 18 on socket 1 00:06:46.739 EAL: Detected lcore 101 as core 19 on socket 1 00:06:46.739 EAL: Detected lcore 102 as core 20 on socket 1 00:06:46.739 EAL: Detected lcore 103 as core 21 on socket 1 00:06:46.739 EAL: Detected lcore 104 as core 22 on socket 1 00:06:46.739 EAL: Detected lcore 105 as core 24 on socket 1 00:06:46.739 EAL: Detected lcore 106 as core 25 on socket 1 00:06:46.739 EAL: Detected lcore 107 as core 26 on socket 1 00:06:46.739 EAL: Detected lcore 108 as core 27 on socket 1 00:06:46.739 EAL: Detected lcore 109 as core 28 on socket 1 00:06:46.739 EAL: Detected lcore 110 as core 29 on socket 1 00:06:46.739 EAL: Detected lcore 111 as core 30 on socket 1 00:06:46.739 EAL: Maximum logical cores by configuration: 128 00:06:46.739 EAL: Detected CPU lcores: 112 00:06:46.739 EAL: Detected NUMA nodes: 2 00:06:46.739 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:06:46.739 EAL: Checking presence of .so 'librte_eal.so.23' 00:06:46.739 EAL: Checking presence of .so 'librte_eal.so' 00:06:46.739 EAL: Detected static linkage of DPDK 00:06:46.739 EAL: No shared files mode enabled, IPC will be disabled 00:06:46.739 EAL: Bus pci wants IOVA as 'DC' 00:06:46.739 EAL: Buses did not request a specific IOVA mode. 00:06:46.739 EAL: IOMMU is available, selecting IOVA as VA mode. 00:06:46.739 EAL: Selected IOVA mode 'VA' 00:06:46.739 EAL: Probing VFIO support... 
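EAL has detected an IOMMU above and selected IOVA-as-VA, and the VFIO probe that follows confirms a Type 1 backend. A hedged way to check the same preconditions from a shell before launching a DPDK app (standard sysfs paths; the group type file only exists on newer kernels, and vfio may be built in rather than loaded as a module):

    ls /sys/kernel/iommu_groups | wc -l            # non-zero means an IOMMU is active
    lsmod | grep -E 'vfio_pci|vfio_iommu_type1'    # the Type 1 backend probed here
    cat /sys/bus/pci/devices/0000:d8:00.0/iommu_group/type 2>/dev/null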
00:06:46.739 EAL: IOMMU type 1 (Type 1) is supported 00:06:46.739 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:46.739 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:46.739 EAL: VFIO support initialized 00:06:46.739 EAL: Ask a virtual area of 0x2e000 bytes 00:06:46.739 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:46.739 EAL: Setting up physically contiguous memory... 00:06:46.739 EAL: Setting maximum number of open files to 524288 00:06:46.739 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:46.739 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:46.739 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:46.739 EAL: Ask a virtual area of 0x61000 bytes 00:06:46.739 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:46.739 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:46.739 EAL: Ask a virtual area of 0x400000000 bytes 00:06:46.739 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:46.739 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:46.739 EAL: Ask a virtual area of 0x61000 bytes 00:06:46.739 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:46.739 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:46.739 EAL: Ask a virtual area of 0x400000000 bytes 00:06:46.739 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:46.739 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:46.739 EAL: Ask a virtual area of 0x61000 bytes 00:06:46.739 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:46.739 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:46.739 EAL: Ask a virtual area of 0x400000000 bytes 00:06:46.739 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:46.739 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:46.739 EAL: Ask a virtual area of 0x61000 bytes 00:06:46.739 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:46.739 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:46.739 EAL: Ask a virtual area of 0x400000000 bytes 00:06:46.739 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:46.739 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:46.739 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:46.739 EAL: Ask a virtual area of 0x61000 bytes 00:06:46.739 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:46.739 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:46.739 EAL: Ask a virtual area of 0x400000000 bytes 00:06:46.739 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:46.739 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:46.739 EAL: Ask a virtual area of 0x61000 bytes 00:06:46.739 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:46.739 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:46.739 EAL: Ask a virtual area of 0x400000000 bytes 00:06:46.739 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:46.739 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:46.739 EAL: Ask a virtual area of 0x61000 bytes 00:06:46.739 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:46.739 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:46.739 EAL: Ask a virtual area of 0x400000000 bytes 00:06:46.739 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:06:46.739 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:46.739 EAL: Ask a virtual area of 0x61000 bytes 00:06:46.739 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:46.740 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:46.740 EAL: Ask a virtual area of 0x400000000 bytes 00:06:46.740 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:46.740 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:46.740 EAL: Hugepages will be freed exactly as allocated. 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: TSC frequency is ~2500000 KHz 00:06:46.740 EAL: Main lcore 0 is ready (tid=7fd05d08ea00;cpuset=[0]) 00:06:46.740 EAL: Trying to obtain current memory policy. 00:06:46.740 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:46.740 EAL: Restoring previous memory policy: 0 00:06:46.740 EAL: request: mp_malloc_sync 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Heap on socket 0 was expanded by 2MB 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Mem event callback 'spdk:(nil)' registered 00:06:46.740 00:06:46.740 00:06:46.740 CUnit - A unit testing framework for C - Version 2.1-3 00:06:46.740 http://cunit.sourceforge.net/ 00:06:46.740 00:06:46.740 00:06:46.740 Suite: components_suite 00:06:46.740 Test: vtophys_malloc_test ...passed 00:06:46.740 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:46.740 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:46.740 EAL: Restoring previous memory policy: 4 00:06:46.740 EAL: Calling mem event callback 'spdk:(nil)' 00:06:46.740 EAL: request: mp_malloc_sync 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Heap on socket 0 was expanded by 4MB 00:06:46.740 EAL: Calling mem event callback 'spdk:(nil)' 00:06:46.740 EAL: request: mp_malloc_sync 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Heap on socket 0 was shrunk by 4MB 00:06:46.740 EAL: Trying to obtain current memory policy. 00:06:46.740 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:46.740 EAL: Restoring previous memory policy: 4 00:06:46.740 EAL: Calling mem event callback 'spdk:(nil)' 00:06:46.740 EAL: request: mp_malloc_sync 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Heap on socket 0 was expanded by 6MB 00:06:46.740 EAL: Calling mem event callback 'spdk:(nil)' 00:06:46.740 EAL: request: mp_malloc_sync 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Heap on socket 0 was shrunk by 6MB 00:06:46.740 EAL: Trying to obtain current memory policy. 00:06:46.740 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:46.740 EAL: Restoring previous memory policy: 4 00:06:46.740 EAL: Calling mem event callback 'spdk:(nil)' 00:06:46.740 EAL: request: mp_malloc_sync 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Heap on socket 0 was expanded by 10MB 00:06:46.740 EAL: Calling mem event callback 'spdk:(nil)' 00:06:46.740 EAL: request: mp_malloc_sync 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Heap on socket 0 was shrunk by 10MB 00:06:46.740 EAL: Trying to obtain current memory policy. 
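Each 'expanded by'/'shrunk by' pair in this suite is the DPDK heap growing by whole 2 MB hugepages and handing them straight back ('Hugepages will be freed exactly as allocated'). The effect can be watched from outside the test by sampling the kernel's hugepage counters while it runs (a simple observation aid, not part of the harness):

    watch -n 1 'grep HugePages_ /proc/meminfo'    # HugePages_Free drops on expand, recovers on shrink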
00:06:46.740 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:46.740 EAL: Restoring previous memory policy: 4 00:06:46.740 EAL: Calling mem event callback 'spdk:(nil)' 00:06:46.740 EAL: request: mp_malloc_sync 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Heap on socket 0 was expanded by 18MB 00:06:46.740 EAL: Calling mem event callback 'spdk:(nil)' 00:06:46.740 EAL: request: mp_malloc_sync 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Heap on socket 0 was shrunk by 18MB 00:06:46.740 EAL: Trying to obtain current memory policy. 00:06:46.740 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:46.740 EAL: Restoring previous memory policy: 4 00:06:46.740 EAL: Calling mem event callback 'spdk:(nil)' 00:06:46.740 EAL: request: mp_malloc_sync 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Heap on socket 0 was expanded by 34MB 00:06:46.740 EAL: Calling mem event callback 'spdk:(nil)' 00:06:46.740 EAL: request: mp_malloc_sync 00:06:46.740 EAL: No shared files mode enabled, IPC is disabled 00:06:46.740 EAL: Heap on socket 0 was shrunk by 34MB 00:06:46.740 EAL: Trying to obtain current memory policy. 00:06:46.740 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.001 EAL: Restoring previous memory policy: 4 00:06:47.001 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.001 EAL: request: mp_malloc_sync 00:06:47.001 EAL: No shared files mode enabled, IPC is disabled 00:06:47.001 EAL: Heap on socket 0 was expanded by 66MB 00:06:47.001 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.001 EAL: request: mp_malloc_sync 00:06:47.001 EAL: No shared files mode enabled, IPC is disabled 00:06:47.001 EAL: Heap on socket 0 was shrunk by 66MB 00:06:47.001 EAL: Trying to obtain current memory policy. 00:06:47.001 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.001 EAL: Restoring previous memory policy: 4 00:06:47.001 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.001 EAL: request: mp_malloc_sync 00:06:47.001 EAL: No shared files mode enabled, IPC is disabled 00:06:47.001 EAL: Heap on socket 0 was expanded by 130MB 00:06:47.001 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.001 EAL: request: mp_malloc_sync 00:06:47.001 EAL: No shared files mode enabled, IPC is disabled 00:06:47.001 EAL: Heap on socket 0 was shrunk by 130MB 00:06:47.001 EAL: Trying to obtain current memory policy. 00:06:47.001 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.001 EAL: Restoring previous memory policy: 4 00:06:47.001 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.001 EAL: request: mp_malloc_sync 00:06:47.001 EAL: No shared files mode enabled, IPC is disabled 00:06:47.001 EAL: Heap on socket 0 was expanded by 258MB 00:06:47.001 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.001 EAL: request: mp_malloc_sync 00:06:47.001 EAL: No shared files mode enabled, IPC is disabled 00:06:47.001 EAL: Heap on socket 0 was shrunk by 258MB 00:06:47.001 EAL: Trying to obtain current memory policy. 
00:06:47.001 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.261 EAL: Restoring previous memory policy: 4 00:06:47.261 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.261 EAL: request: mp_malloc_sync 00:06:47.261 EAL: No shared files mode enabled, IPC is disabled 00:06:47.262 EAL: Heap on socket 0 was expanded by 514MB 00:06:47.262 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.262 EAL: request: mp_malloc_sync 00:06:47.262 EAL: No shared files mode enabled, IPC is disabled 00:06:47.262 EAL: Heap on socket 0 was shrunk by 514MB 00:06:47.262 EAL: Trying to obtain current memory policy. 00:06:47.262 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:47.521 EAL: Restoring previous memory policy: 4 00:06:47.521 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.521 EAL: request: mp_malloc_sync 00:06:47.521 EAL: No shared files mode enabled, IPC is disabled 00:06:47.521 EAL: Heap on socket 0 was expanded by 1026MB 00:06:47.782 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.782 EAL: request: mp_malloc_sync 00:06:47.782 EAL: No shared files mode enabled, IPC is disabled 00:06:47.782 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:47.782 passed 00:06:47.782 00:06:47.782 Run Summary: Type Total Ran Passed Failed Inactive 00:06:47.782 suites 1 1 n/a 0 0 00:06:47.782 tests 2 2 2 0 0 00:06:47.782 asserts 497 497 497 0 n/a 00:06:47.782 00:06:47.782 Elapsed time = 0.981 seconds 00:06:47.782 EAL: Calling mem event callback 'spdk:(nil)' 00:06:47.782 EAL: request: mp_malloc_sync 00:06:47.782 EAL: No shared files mode enabled, IPC is disabled 00:06:47.782 EAL: Heap on socket 0 was shrunk by 2MB 00:06:47.782 EAL: No shared files mode enabled, IPC is disabled 00:06:47.782 EAL: No shared files mode enabled, IPC is disabled 00:06:47.782 EAL: No shared files mode enabled, IPC is disabled 00:06:47.782 00:06:47.782 real 0m1.110s 00:06:47.782 user 0m0.642s 00:06:47.782 sys 0m0.444s 00:06:47.782 12:48:51 env.env_vtophys -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:47.782 12:48:51 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:47.782 ************************************ 00:06:47.782 END TEST env_vtophys 00:06:47.782 ************************************ 00:06:47.782 12:48:51 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:47.782 12:48:51 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.782 12:48:51 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.782 12:48:51 env -- common/autotest_common.sh@10 -- # set +x 00:06:48.043 ************************************ 00:06:48.043 START TEST env_pci 00:06:48.043 ************************************ 00:06:48.043 12:48:51 env.env_pci -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:48.043 00:06:48.043 00:06:48.043 CUnit - A unit testing framework for C - Version 2.1-3 00:06:48.043 http://cunit.sourceforge.net/ 00:06:48.043 00:06:48.043 00:06:48.043 Suite: pci 00:06:48.043 Test: pci_hook ...[2024-12-05 12:48:51.125087] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1118:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 134260 has claimed it 00:06:48.043 EAL: Cannot find device (10000:00:01.0) 00:06:48.043 EAL: Failed to attach device on primary process 00:06:48.043 passed 00:06:48.043 00:06:48.043 Run Summary: Type Total Ran Passed Failed Inactive 
00:06:48.043 suites 1 1 n/a 0 0 00:06:48.043 tests 1 1 1 0 0 00:06:48.043 asserts 25 25 25 0 n/a 00:06:48.043 00:06:48.043 Elapsed time = 0.034 seconds 00:06:48.043 00:06:48.043 real 0m0.052s 00:06:48.043 user 0m0.007s 00:06:48.043 sys 0m0.045s 00:06:48.043 12:48:51 env.env_pci -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:48.043 12:48:51 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:48.043 ************************************ 00:06:48.043 END TEST env_pci 00:06:48.043 ************************************ 00:06:48.043 12:48:51 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:48.043 12:48:51 env -- env/env.sh@15 -- # uname 00:06:48.043 12:48:51 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:48.043 12:48:51 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:48.043 12:48:51 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:48.043 12:48:51 env -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:06:48.043 12:48:51 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:48.043 12:48:51 env -- common/autotest_common.sh@10 -- # set +x 00:06:48.043 ************************************ 00:06:48.043 START TEST env_dpdk_post_init 00:06:48.043 ************************************ 00:06:48.043 12:48:51 env.env_dpdk_post_init -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:48.043 EAL: Detected CPU lcores: 112 00:06:48.043 EAL: Detected NUMA nodes: 2 00:06:48.043 EAL: Detected static linkage of DPDK 00:06:48.043 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:48.043 EAL: Selected IOVA mode 'VA' 00:06:48.043 EAL: VFIO support initialized 00:06:48.043 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:48.303 EAL: Using IOMMU type 1 (Type 1) 00:06:48.874 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:06:53.075 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:06:53.075 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:06:53.075 Starting DPDK initialization... 00:06:53.075 Starting SPDK post initialization... 00:06:53.075 SPDK NVMe probe 00:06:53.075 Attaching to 0000:d8:00.0 00:06:53.075 Attached to 0000:d8:00.0 00:06:53.075 Cleaning up... 
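That probe only succeeds because scripts/setup.sh rebound the controller from the kernel nvme driver to vfio-pci earlier in the run. A hedged sketch of such a rebind using plain sysfs knobs (run as root; setup.sh also handles hugepages and permissions, so prefer it in practice):

    echo 0000:d8:00.0 > /sys/bus/pci/devices/0000:d8:00.0/driver/unbind
    echo vfio-pci     > /sys/bus/pci/devices/0000:d8:00.0/driver_override
    echo 0000:d8:00.0 > /sys/bus/pci/drivers_probe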
00:06:53.075 00:06:53.075 real 0m4.763s 00:06:53.075 user 0m3.589s 00:06:53.075 sys 0m0.420s 00:06:53.075 12:48:56 env.env_dpdk_post_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:53.075 12:48:56 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:53.075 ************************************ 00:06:53.075 END TEST env_dpdk_post_init 00:06:53.075 ************************************ 00:06:53.075 12:48:56 env -- env/env.sh@26 -- # uname 00:06:53.075 12:48:56 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:53.075 12:48:56 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:53.075 12:48:56 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:53.075 12:48:56 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:53.075 12:48:56 env -- common/autotest_common.sh@10 -- # set +x 00:06:53.075 ************************************ 00:06:53.075 START TEST env_mem_callbacks 00:06:53.075 ************************************ 00:06:53.075 12:48:56 env.env_mem_callbacks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:53.075 EAL: Detected CPU lcores: 112 00:06:53.075 EAL: Detected NUMA nodes: 2 00:06:53.075 EAL: Detected static linkage of DPDK 00:06:53.075 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:53.075 EAL: Selected IOVA mode 'VA' 00:06:53.075 EAL: VFIO support initialized 00:06:53.075 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:53.075 00:06:53.075 00:06:53.075 CUnit - A unit testing framework for C - Version 2.1-3 00:06:53.075 http://cunit.sourceforge.net/ 00:06:53.075 00:06:53.075 00:06:53.075 Suite: memory 00:06:53.075 Test: test ... 
00:06:53.075 register 0x200000200000 2097152 00:06:53.075 malloc 3145728 00:06:53.075 register 0x200000400000 4194304 00:06:53.075 buf 0x200000500000 len 3145728 PASSED 00:06:53.075 malloc 64 00:06:53.075 buf 0x2000004fff40 len 64 PASSED 00:06:53.075 malloc 4194304 00:06:53.075 register 0x200000800000 6291456 00:06:53.075 buf 0x200000a00000 len 4194304 PASSED 00:06:53.075 free 0x200000500000 3145728 00:06:53.075 free 0x2000004fff40 64 00:06:53.075 unregister 0x200000400000 4194304 PASSED 00:06:53.075 free 0x200000a00000 4194304 00:06:53.075 unregister 0x200000800000 6291456 PASSED 00:06:53.075 malloc 8388608 00:06:53.075 register 0x200000400000 10485760 00:06:53.075 buf 0x200000600000 len 8388608 PASSED 00:06:53.075 free 0x200000600000 8388608 00:06:53.075 unregister 0x200000400000 10485760 PASSED 00:06:53.075 passed 00:06:53.075 00:06:53.075 Run Summary: Type Total Ran Passed Failed Inactive 00:06:53.075 suites 1 1 n/a 0 0 00:06:53.075 tests 1 1 1 0 0 00:06:53.075 asserts 15 15 15 0 n/a 00:06:53.075 00:06:53.075 Elapsed time = 0.008 seconds 00:06:53.075 00:06:53.075 real 0m0.070s 00:06:53.075 user 0m0.015s 00:06:53.075 sys 0m0.054s 00:06:53.075 12:48:56 env.env_mem_callbacks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:53.075 12:48:56 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:53.075 ************************************ 00:06:53.075 END TEST env_mem_callbacks 00:06:53.075 ************************************ 00:06:53.075 00:06:53.075 real 0m6.696s 00:06:53.075 user 0m4.568s 00:06:53.075 sys 0m1.396s 00:06:53.075 12:48:56 env -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:53.075 12:48:56 env -- common/autotest_common.sh@10 -- # set +x 00:06:53.075 ************************************ 00:06:53.075 END TEST env 00:06:53.075 ************************************ 00:06:53.075 12:48:56 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:53.075 12:48:56 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:53.075 12:48:56 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:53.075 12:48:56 -- common/autotest_common.sh@10 -- # set +x 00:06:53.075 ************************************ 00:06:53.075 START TEST rpc 00:06:53.075 ************************************ 00:06:53.075 12:48:56 rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:53.336 * Looking for test storage... 
00:06:53.336 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:53.336 12:48:56 rpc -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@1711 -- # lcov --version 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:06:53.337 12:48:56 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:53.337 12:48:56 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:53.337 12:48:56 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:53.337 12:48:56 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:53.337 12:48:56 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:53.337 12:48:56 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:53.337 12:48:56 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:53.337 12:48:56 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:53.337 12:48:56 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:53.337 12:48:56 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:53.337 12:48:56 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:53.337 12:48:56 rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:53.337 12:48:56 rpc -- scripts/common.sh@345 -- # : 1 00:06:53.337 12:48:56 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:53.337 12:48:56 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:53.337 12:48:56 rpc -- scripts/common.sh@365 -- # decimal 1 00:06:53.337 12:48:56 rpc -- scripts/common.sh@353 -- # local d=1 00:06:53.337 12:48:56 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:53.337 12:48:56 rpc -- scripts/common.sh@355 -- # echo 1 00:06:53.337 12:48:56 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:53.337 12:48:56 rpc -- scripts/common.sh@366 -- # decimal 2 00:06:53.337 12:48:56 rpc -- scripts/common.sh@353 -- # local d=2 00:06:53.337 12:48:56 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:53.337 12:48:56 rpc -- scripts/common.sh@355 -- # echo 2 00:06:53.337 12:48:56 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:53.337 12:48:56 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:53.337 12:48:56 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:53.337 12:48:56 rpc -- scripts/common.sh@368 -- # return 0 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:06:53.337 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.337 --rc genhtml_branch_coverage=1 00:06:53.337 --rc genhtml_function_coverage=1 00:06:53.337 --rc genhtml_legend=1 00:06:53.337 --rc geninfo_all_blocks=1 00:06:53.337 --rc geninfo_unexecuted_blocks=1 00:06:53.337 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.337 ' 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:06:53.337 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.337 --rc genhtml_branch_coverage=1 00:06:53.337 --rc genhtml_function_coverage=1 00:06:53.337 --rc genhtml_legend=1 00:06:53.337 --rc geninfo_all_blocks=1 00:06:53.337 --rc geninfo_unexecuted_blocks=1 00:06:53.337 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.337 ' 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@1725 -- # 
export 'LCOV=lcov 00:06:53.337 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.337 --rc genhtml_branch_coverage=1 00:06:53.337 --rc genhtml_function_coverage=1 00:06:53.337 --rc genhtml_legend=1 00:06:53.337 --rc geninfo_all_blocks=1 00:06:53.337 --rc geninfo_unexecuted_blocks=1 00:06:53.337 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.337 ' 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:06:53.337 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.337 --rc genhtml_branch_coverage=1 00:06:53.337 --rc genhtml_function_coverage=1 00:06:53.337 --rc genhtml_legend=1 00:06:53.337 --rc geninfo_all_blocks=1 00:06:53.337 --rc geninfo_unexecuted_blocks=1 00:06:53.337 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.337 ' 00:06:53.337 12:48:56 rpc -- rpc/rpc.sh@65 -- # spdk_pid=135434 00:06:53.337 12:48:56 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:53.337 12:48:56 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:53.337 12:48:56 rpc -- rpc/rpc.sh@67 -- # waitforlisten 135434 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@835 -- # '[' -z 135434 ']' 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:53.337 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:53.337 12:48:56 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.337 [2024-12-05 12:48:56.533997] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:06:53.337 [2024-12-05 12:48:56.534078] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid135434 ] 00:06:53.337 [2024-12-05 12:48:56.619243] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.337 [2024-12-05 12:48:56.640462] app.c: 612:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:53.337 [2024-12-05 12:48:56.640501] app.c: 616:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 135434' to capture a snapshot of events at runtime. 00:06:53.337 [2024-12-05 12:48:56.640510] app.c: 618:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:53.337 [2024-12-05 12:48:56.640518] app.c: 619:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:53.337 [2024-12-05 12:48:56.640525] app.c: 620:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid135434 for offline analysis/debug. 
00:06:53.337 [2024-12-05 12:48:56.641150] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.597 12:48:56 rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:53.597 12:48:56 rpc -- common/autotest_common.sh@868 -- # return 0 00:06:53.597 12:48:56 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:53.597 12:48:56 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:53.597 12:48:56 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:53.597 12:48:56 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:53.597 12:48:56 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:53.597 12:48:56 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:53.597 12:48:56 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.597 ************************************ 00:06:53.597 START TEST rpc_integrity 00:06:53.597 ************************************ 00:06:53.597 12:48:56 rpc.rpc_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:06:53.597 12:48:56 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:53.597 12:48:56 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.597 12:48:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:53.597 12:48:56 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:53.597 12:48:56 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:53.597 12:48:56 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:53.858 12:48:56 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:53.858 12:48:56 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:53.858 12:48:56 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.858 12:48:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:53.858 12:48:56 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:53.858 12:48:56 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:53.858 12:48:56 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:53.858 12:48:56 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.858 12:48:56 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:53.858 12:48:56 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:53.858 12:48:56 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:53.858 { 00:06:53.858 "name": "Malloc0", 00:06:53.858 "aliases": [ 00:06:53.858 "64da41ef-47a1-4099-a46b-6475449c8ebf" 00:06:53.858 ], 00:06:53.858 "product_name": "Malloc disk", 00:06:53.858 "block_size": 512, 00:06:53.858 "num_blocks": 16384, 00:06:53.858 "uuid": "64da41ef-47a1-4099-a46b-6475449c8ebf", 00:06:53.858 "assigned_rate_limits": { 00:06:53.858 "rw_ios_per_sec": 0, 00:06:53.858 "rw_mbytes_per_sec": 0, 00:06:53.858 "r_mbytes_per_sec": 0, 00:06:53.858 "w_mbytes_per_sec": 
0 00:06:53.858 }, 00:06:53.858 "claimed": false, 00:06:53.858 "zoned": false, 00:06:53.858 "supported_io_types": { 00:06:53.858 "read": true, 00:06:53.858 "write": true, 00:06:53.858 "unmap": true, 00:06:53.858 "flush": true, 00:06:53.858 "reset": true, 00:06:53.858 "nvme_admin": false, 00:06:53.858 "nvme_io": false, 00:06:53.858 "nvme_io_md": false, 00:06:53.858 "write_zeroes": true, 00:06:53.858 "zcopy": true, 00:06:53.858 "get_zone_info": false, 00:06:53.858 "zone_management": false, 00:06:53.858 "zone_append": false, 00:06:53.858 "compare": false, 00:06:53.858 "compare_and_write": false, 00:06:53.858 "abort": true, 00:06:53.858 "seek_hole": false, 00:06:53.858 "seek_data": false, 00:06:53.858 "copy": true, 00:06:53.858 "nvme_iov_md": false 00:06:53.858 }, 00:06:53.858 "memory_domains": [ 00:06:53.858 { 00:06:53.858 "dma_device_id": "system", 00:06:53.858 "dma_device_type": 1 00:06:53.858 }, 00:06:53.858 { 00:06:53.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:53.858 "dma_device_type": 2 00:06:53.858 } 00:06:53.858 ], 00:06:53.858 "driver_specific": {} 00:06:53.858 } 00:06:53.858 ]' 00:06:53.858 12:48:56 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:53.858 12:48:57 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:53.858 12:48:57 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:53.858 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:53.859 [2024-12-05 12:48:57.038319] vbdev_passthru.c: 608:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:53.859 [2024-12-05 12:48:57.038351] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:53.859 [2024-12-05 12:48:57.038368] vbdev_passthru.c: 682:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x62c5740 00:06:53.859 [2024-12-05 12:48:57.038378] vbdev_passthru.c: 697:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:53.859 [2024-12-05 12:48:57.039270] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:53.859 [2024-12-05 12:48:57.039295] vbdev_passthru.c: 711:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:53.859 Passthru0 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:53.859 12:48:57 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:53.859 12:48:57 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:53.859 { 00:06:53.859 "name": "Malloc0", 00:06:53.859 "aliases": [ 00:06:53.859 "64da41ef-47a1-4099-a46b-6475449c8ebf" 00:06:53.859 ], 00:06:53.859 "product_name": "Malloc disk", 00:06:53.859 "block_size": 512, 00:06:53.859 "num_blocks": 16384, 00:06:53.859 "uuid": "64da41ef-47a1-4099-a46b-6475449c8ebf", 00:06:53.859 "assigned_rate_limits": { 00:06:53.859 "rw_ios_per_sec": 0, 00:06:53.859 "rw_mbytes_per_sec": 0, 00:06:53.859 "r_mbytes_per_sec": 0, 00:06:53.859 "w_mbytes_per_sec": 0 00:06:53.859 }, 00:06:53.859 "claimed": true, 00:06:53.859 "claim_type": "exclusive_write", 00:06:53.859 "zoned": false, 00:06:53.859 "supported_io_types": { 00:06:53.859 "read": true, 00:06:53.859 "write": true, 00:06:53.859 "unmap": true, 
00:06:53.859 "flush": true, 00:06:53.859 "reset": true, 00:06:53.859 "nvme_admin": false, 00:06:53.859 "nvme_io": false, 00:06:53.859 "nvme_io_md": false, 00:06:53.859 "write_zeroes": true, 00:06:53.859 "zcopy": true, 00:06:53.859 "get_zone_info": false, 00:06:53.859 "zone_management": false, 00:06:53.859 "zone_append": false, 00:06:53.859 "compare": false, 00:06:53.859 "compare_and_write": false, 00:06:53.859 "abort": true, 00:06:53.859 "seek_hole": false, 00:06:53.859 "seek_data": false, 00:06:53.859 "copy": true, 00:06:53.859 "nvme_iov_md": false 00:06:53.859 }, 00:06:53.859 "memory_domains": [ 00:06:53.859 { 00:06:53.859 "dma_device_id": "system", 00:06:53.859 "dma_device_type": 1 00:06:53.859 }, 00:06:53.859 { 00:06:53.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:53.859 "dma_device_type": 2 00:06:53.859 } 00:06:53.859 ], 00:06:53.859 "driver_specific": {} 00:06:53.859 }, 00:06:53.859 { 00:06:53.859 "name": "Passthru0", 00:06:53.859 "aliases": [ 00:06:53.859 "6151592e-3f3c-5df9-bd3d-d2304a66c68f" 00:06:53.859 ], 00:06:53.859 "product_name": "passthru", 00:06:53.859 "block_size": 512, 00:06:53.859 "num_blocks": 16384, 00:06:53.859 "uuid": "6151592e-3f3c-5df9-bd3d-d2304a66c68f", 00:06:53.859 "assigned_rate_limits": { 00:06:53.859 "rw_ios_per_sec": 0, 00:06:53.859 "rw_mbytes_per_sec": 0, 00:06:53.859 "r_mbytes_per_sec": 0, 00:06:53.859 "w_mbytes_per_sec": 0 00:06:53.859 }, 00:06:53.859 "claimed": false, 00:06:53.859 "zoned": false, 00:06:53.859 "supported_io_types": { 00:06:53.859 "read": true, 00:06:53.859 "write": true, 00:06:53.859 "unmap": true, 00:06:53.859 "flush": true, 00:06:53.859 "reset": true, 00:06:53.859 "nvme_admin": false, 00:06:53.859 "nvme_io": false, 00:06:53.859 "nvme_io_md": false, 00:06:53.859 "write_zeroes": true, 00:06:53.859 "zcopy": true, 00:06:53.859 "get_zone_info": false, 00:06:53.859 "zone_management": false, 00:06:53.859 "zone_append": false, 00:06:53.859 "compare": false, 00:06:53.859 "compare_and_write": false, 00:06:53.859 "abort": true, 00:06:53.859 "seek_hole": false, 00:06:53.859 "seek_data": false, 00:06:53.859 "copy": true, 00:06:53.859 "nvme_iov_md": false 00:06:53.859 }, 00:06:53.859 "memory_domains": [ 00:06:53.859 { 00:06:53.859 "dma_device_id": "system", 00:06:53.859 "dma_device_type": 1 00:06:53.859 }, 00:06:53.859 { 00:06:53.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:53.859 "dma_device_type": 2 00:06:53.859 } 00:06:53.859 ], 00:06:53.859 "driver_specific": { 00:06:53.859 "passthru": { 00:06:53.859 "name": "Passthru0", 00:06:53.859 "base_bdev_name": "Malloc0" 00:06:53.859 } 00:06:53.859 } 00:06:53.859 } 00:06:53.859 ]' 00:06:53.859 12:48:57 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:53.859 12:48:57 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:53.859 12:48:57 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:53.859 12:48:57 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:53.859 12:48:57 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:53.859 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:53.859 12:48:57 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:53.859 12:48:57 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:54.119 12:48:57 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:54.119 00:06:54.119 real 0m0.301s 00:06:54.119 user 0m0.181s 00:06:54.119 sys 0m0.057s 00:06:54.119 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:54.120 12:48:57 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.120 ************************************ 00:06:54.120 END TEST rpc_integrity 00:06:54.120 ************************************ 00:06:54.120 12:48:57 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:54.120 12:48:57 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:54.120 12:48:57 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:54.120 12:48:57 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:54.120 ************************************ 00:06:54.120 START TEST rpc_plugins 00:06:54.120 ************************************ 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@1129 -- # rpc_plugins 00:06:54.120 12:48:57 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.120 12:48:57 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:54.120 12:48:57 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.120 12:48:57 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:54.120 { 00:06:54.120 "name": "Malloc1", 00:06:54.120 "aliases": [ 00:06:54.120 "cbcdd619-772f-4fcd-beee-fe0f92093e2e" 00:06:54.120 ], 00:06:54.120 "product_name": "Malloc disk", 00:06:54.120 "block_size": 4096, 00:06:54.120 "num_blocks": 256, 00:06:54.120 "uuid": "cbcdd619-772f-4fcd-beee-fe0f92093e2e", 00:06:54.120 "assigned_rate_limits": { 00:06:54.120 "rw_ios_per_sec": 0, 00:06:54.120 "rw_mbytes_per_sec": 0, 00:06:54.120 "r_mbytes_per_sec": 0, 00:06:54.120 "w_mbytes_per_sec": 0 00:06:54.120 }, 00:06:54.120 "claimed": false, 00:06:54.120 "zoned": false, 00:06:54.120 "supported_io_types": { 00:06:54.120 "read": true, 00:06:54.120 "write": true, 00:06:54.120 "unmap": true, 00:06:54.120 "flush": true, 00:06:54.120 "reset": true, 00:06:54.120 "nvme_admin": false, 00:06:54.120 "nvme_io": false, 00:06:54.120 "nvme_io_md": false, 00:06:54.120 "write_zeroes": true, 00:06:54.120 "zcopy": true, 00:06:54.120 "get_zone_info": false, 00:06:54.120 "zone_management": false, 00:06:54.120 "zone_append": false, 00:06:54.120 "compare": false, 00:06:54.120 "compare_and_write": false, 00:06:54.120 "abort": true, 00:06:54.120 "seek_hole": false, 00:06:54.120 "seek_data": false, 00:06:54.120 "copy": true, 00:06:54.120 
"nvme_iov_md": false 00:06:54.120 }, 00:06:54.120 "memory_domains": [ 00:06:54.120 { 00:06:54.120 "dma_device_id": "system", 00:06:54.120 "dma_device_type": 1 00:06:54.120 }, 00:06:54.120 { 00:06:54.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:54.120 "dma_device_type": 2 00:06:54.120 } 00:06:54.120 ], 00:06:54.120 "driver_specific": {} 00:06:54.120 } 00:06:54.120 ]' 00:06:54.120 12:48:57 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:54.120 12:48:57 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:54.120 12:48:57 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.120 12:48:57 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.120 12:48:57 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:54.120 12:48:57 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:54.120 12:48:57 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:54.120 00:06:54.120 real 0m0.147s 00:06:54.120 user 0m0.095s 00:06:54.120 sys 0m0.020s 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:54.120 12:48:57 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:54.120 ************************************ 00:06:54.120 END TEST rpc_plugins 00:06:54.120 ************************************ 00:06:54.380 12:48:57 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:54.380 12:48:57 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:54.380 12:48:57 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:54.380 12:48:57 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:54.380 ************************************ 00:06:54.380 START TEST rpc_trace_cmd_test 00:06:54.380 ************************************ 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1129 -- # rpc_trace_cmd_test 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:54.380 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid135434", 00:06:54.380 "tpoint_group_mask": "0x8", 00:06:54.380 "iscsi_conn": { 00:06:54.380 "mask": "0x2", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "scsi": { 00:06:54.380 "mask": "0x4", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "bdev": { 00:06:54.380 "mask": "0x8", 00:06:54.380 "tpoint_mask": "0xffffffffffffffff" 00:06:54.380 }, 00:06:54.380 "nvmf_rdma": { 00:06:54.380 "mask": "0x10", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "nvmf_tcp": { 00:06:54.380 "mask": "0x20", 
00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "ftl": { 00:06:54.380 "mask": "0x40", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "blobfs": { 00:06:54.380 "mask": "0x80", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "dsa": { 00:06:54.380 "mask": "0x200", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "thread": { 00:06:54.380 "mask": "0x400", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "nvme_pcie": { 00:06:54.380 "mask": "0x800", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "iaa": { 00:06:54.380 "mask": "0x1000", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "nvme_tcp": { 00:06:54.380 "mask": "0x2000", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "bdev_nvme": { 00:06:54.380 "mask": "0x4000", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "sock": { 00:06:54.380 "mask": "0x8000", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "blob": { 00:06:54.380 "mask": "0x10000", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "bdev_raid": { 00:06:54.380 "mask": "0x20000", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 }, 00:06:54.380 "scheduler": { 00:06:54.380 "mask": "0x40000", 00:06:54.380 "tpoint_mask": "0x0" 00:06:54.380 } 00:06:54.380 }' 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 19 -gt 2 ']' 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:54.380 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:54.640 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:54.640 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:54.640 12:48:57 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:54.640 00:06:54.640 real 0m0.226s 00:06:54.640 user 0m0.176s 00:06:54.640 sys 0m0.043s 00:06:54.640 12:48:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:54.640 12:48:57 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:54.640 ************************************ 00:06:54.640 END TEST rpc_trace_cmd_test 00:06:54.640 ************************************ 00:06:54.640 12:48:57 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:54.640 12:48:57 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:54.640 12:48:57 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:54.640 12:48:57 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:54.640 12:48:57 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:54.640 12:48:57 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:54.640 ************************************ 00:06:54.640 START TEST rpc_daemon_integrity 00:06:54.640 ************************************ 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.640 12:48:57 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:54.640 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.641 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.641 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.641 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:54.641 { 00:06:54.641 "name": "Malloc2", 00:06:54.641 "aliases": [ 00:06:54.641 "74383e64-45f1-4421-9d5e-8dbe612e9b73" 00:06:54.641 ], 00:06:54.641 "product_name": "Malloc disk", 00:06:54.641 "block_size": 512, 00:06:54.641 "num_blocks": 16384, 00:06:54.641 "uuid": "74383e64-45f1-4421-9d5e-8dbe612e9b73", 00:06:54.641 "assigned_rate_limits": { 00:06:54.641 "rw_ios_per_sec": 0, 00:06:54.641 "rw_mbytes_per_sec": 0, 00:06:54.641 "r_mbytes_per_sec": 0, 00:06:54.641 "w_mbytes_per_sec": 0 00:06:54.641 }, 00:06:54.641 "claimed": false, 00:06:54.641 "zoned": false, 00:06:54.641 "supported_io_types": { 00:06:54.641 "read": true, 00:06:54.641 "write": true, 00:06:54.641 "unmap": true, 00:06:54.641 "flush": true, 00:06:54.641 "reset": true, 00:06:54.641 "nvme_admin": false, 00:06:54.641 "nvme_io": false, 00:06:54.641 "nvme_io_md": false, 00:06:54.641 "write_zeroes": true, 00:06:54.641 "zcopy": true, 00:06:54.641 "get_zone_info": false, 00:06:54.641 "zone_management": false, 00:06:54.641 "zone_append": false, 00:06:54.641 "compare": false, 00:06:54.641 "compare_and_write": false, 00:06:54.641 "abort": true, 00:06:54.641 "seek_hole": false, 00:06:54.641 "seek_data": false, 00:06:54.641 "copy": true, 00:06:54.641 "nvme_iov_md": false 00:06:54.641 }, 00:06:54.641 "memory_domains": [ 00:06:54.641 { 00:06:54.641 "dma_device_id": "system", 00:06:54.641 "dma_device_type": 1 00:06:54.641 }, 00:06:54.641 { 00:06:54.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:54.641 "dma_device_type": 2 00:06:54.641 } 00:06:54.641 ], 00:06:54.641 "driver_specific": {} 00:06:54.641 } 00:06:54.641 ]' 00:06:54.641 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:54.641 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:54.641 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:54.641 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.641 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.641 [2024-12-05 12:48:57.948651] vbdev_passthru.c: 608:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:54.641 
[2024-12-05 12:48:57.948682] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:54.641 [2024-12-05 12:48:57.948703] vbdev_passthru.c: 682:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x62bb8d0 00:06:54.641 [2024-12-05 12:48:57.948716] vbdev_passthru.c: 697:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:54.641 [2024-12-05 12:48:57.949470] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:54.641 [2024-12-05 12:48:57.949493] vbdev_passthru.c: 711:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:54.641 Passthru0 00:06:54.641 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.901 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:54.901 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.901 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.901 12:48:57 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.901 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:54.901 { 00:06:54.901 "name": "Malloc2", 00:06:54.901 "aliases": [ 00:06:54.901 "74383e64-45f1-4421-9d5e-8dbe612e9b73" 00:06:54.901 ], 00:06:54.901 "product_name": "Malloc disk", 00:06:54.901 "block_size": 512, 00:06:54.901 "num_blocks": 16384, 00:06:54.901 "uuid": "74383e64-45f1-4421-9d5e-8dbe612e9b73", 00:06:54.901 "assigned_rate_limits": { 00:06:54.901 "rw_ios_per_sec": 0, 00:06:54.901 "rw_mbytes_per_sec": 0, 00:06:54.901 "r_mbytes_per_sec": 0, 00:06:54.901 "w_mbytes_per_sec": 0 00:06:54.901 }, 00:06:54.901 "claimed": true, 00:06:54.901 "claim_type": "exclusive_write", 00:06:54.901 "zoned": false, 00:06:54.901 "supported_io_types": { 00:06:54.901 "read": true, 00:06:54.901 "write": true, 00:06:54.901 "unmap": true, 00:06:54.901 "flush": true, 00:06:54.901 "reset": true, 00:06:54.901 "nvme_admin": false, 00:06:54.901 "nvme_io": false, 00:06:54.901 "nvme_io_md": false, 00:06:54.901 "write_zeroes": true, 00:06:54.901 "zcopy": true, 00:06:54.901 "get_zone_info": false, 00:06:54.901 "zone_management": false, 00:06:54.901 "zone_append": false, 00:06:54.901 "compare": false, 00:06:54.901 "compare_and_write": false, 00:06:54.901 "abort": true, 00:06:54.901 "seek_hole": false, 00:06:54.901 "seek_data": false, 00:06:54.902 "copy": true, 00:06:54.902 "nvme_iov_md": false 00:06:54.902 }, 00:06:54.902 "memory_domains": [ 00:06:54.902 { 00:06:54.902 "dma_device_id": "system", 00:06:54.902 "dma_device_type": 1 00:06:54.902 }, 00:06:54.902 { 00:06:54.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:54.902 "dma_device_type": 2 00:06:54.902 } 00:06:54.902 ], 00:06:54.902 "driver_specific": {} 00:06:54.902 }, 00:06:54.902 { 00:06:54.902 "name": "Passthru0", 00:06:54.902 "aliases": [ 00:06:54.902 "dcdd0008-b238-5a25-82a1-fe12d96866b3" 00:06:54.902 ], 00:06:54.902 "product_name": "passthru", 00:06:54.902 "block_size": 512, 00:06:54.902 "num_blocks": 16384, 00:06:54.902 "uuid": "dcdd0008-b238-5a25-82a1-fe12d96866b3", 00:06:54.902 "assigned_rate_limits": { 00:06:54.902 "rw_ios_per_sec": 0, 00:06:54.902 "rw_mbytes_per_sec": 0, 00:06:54.902 "r_mbytes_per_sec": 0, 00:06:54.902 "w_mbytes_per_sec": 0 00:06:54.902 }, 00:06:54.902 "claimed": false, 00:06:54.902 "zoned": false, 00:06:54.902 "supported_io_types": { 00:06:54.902 "read": true, 00:06:54.902 "write": true, 00:06:54.902 "unmap": true, 00:06:54.902 "flush": true, 00:06:54.902 "reset": true, 
00:06:54.902 "nvme_admin": false, 00:06:54.902 "nvme_io": false, 00:06:54.902 "nvme_io_md": false, 00:06:54.902 "write_zeroes": true, 00:06:54.902 "zcopy": true, 00:06:54.902 "get_zone_info": false, 00:06:54.902 "zone_management": false, 00:06:54.902 "zone_append": false, 00:06:54.902 "compare": false, 00:06:54.902 "compare_and_write": false, 00:06:54.902 "abort": true, 00:06:54.902 "seek_hole": false, 00:06:54.902 "seek_data": false, 00:06:54.902 "copy": true, 00:06:54.902 "nvme_iov_md": false 00:06:54.902 }, 00:06:54.902 "memory_domains": [ 00:06:54.902 { 00:06:54.902 "dma_device_id": "system", 00:06:54.902 "dma_device_type": 1 00:06:54.902 }, 00:06:54.902 { 00:06:54.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:54.902 "dma_device_type": 2 00:06:54.902 } 00:06:54.902 ], 00:06:54.902 "driver_specific": { 00:06:54.902 "passthru": { 00:06:54.902 "name": "Passthru0", 00:06:54.902 "base_bdev_name": "Malloc2" 00:06:54.902 } 00:06:54.902 } 00:06:54.902 } 00:06:54.902 ]' 00:06:54.902 12:48:57 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:54.902 00:06:54.902 real 0m0.292s 00:06:54.902 user 0m0.173s 00:06:54.902 sys 0m0.056s 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:54.902 12:48:58 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:54.902 ************************************ 00:06:54.902 END TEST rpc_daemon_integrity 00:06:54.902 ************************************ 00:06:54.902 12:48:58 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:54.902 12:48:58 rpc -- rpc/rpc.sh@84 -- # killprocess 135434 00:06:54.902 12:48:58 rpc -- common/autotest_common.sh@954 -- # '[' -z 135434 ']' 00:06:54.902 12:48:58 rpc -- common/autotest_common.sh@958 -- # kill -0 135434 00:06:54.902 12:48:58 rpc -- common/autotest_common.sh@959 -- # uname 00:06:54.902 12:48:58 rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:54.902 12:48:58 rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 135434 
00:06:54.902 12:48:58 rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:54.902 12:48:58 rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:54.902 12:48:58 rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 135434' 00:06:54.902 killing process with pid 135434 00:06:54.902 12:48:58 rpc -- common/autotest_common.sh@973 -- # kill 135434 00:06:54.902 12:48:58 rpc -- common/autotest_common.sh@978 -- # wait 135434 00:06:55.472 00:06:55.472 real 0m2.182s 00:06:55.472 user 0m2.735s 00:06:55.472 sys 0m0.862s 00:06:55.472 12:48:58 rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:55.472 12:48:58 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:55.472 ************************************ 00:06:55.472 END TEST rpc 00:06:55.472 ************************************ 00:06:55.472 12:48:58 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:55.472 12:48:58 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:55.472 12:48:58 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:55.472 12:48:58 -- common/autotest_common.sh@10 -- # set +x 00:06:55.472 ************************************ 00:06:55.472 START TEST skip_rpc 00:06:55.472 ************************************ 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:55.472 * Looking for test storage... 00:06:55.472 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@1711 -- # lcov --version 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@345 -- # : 1 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:55.472 12:48:58 skip_rpc -- scripts/common.sh@368 -- # return 0 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:06:55.472 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.472 --rc genhtml_branch_coverage=1 00:06:55.472 --rc genhtml_function_coverage=1 00:06:55.472 --rc genhtml_legend=1 00:06:55.472 --rc geninfo_all_blocks=1 00:06:55.472 --rc geninfo_unexecuted_blocks=1 00:06:55.472 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.472 ' 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:06:55.472 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.472 --rc genhtml_branch_coverage=1 00:06:55.472 --rc genhtml_function_coverage=1 00:06:55.472 --rc genhtml_legend=1 00:06:55.472 --rc geninfo_all_blocks=1 00:06:55.472 --rc geninfo_unexecuted_blocks=1 00:06:55.472 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.472 ' 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:06:55.472 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.472 --rc genhtml_branch_coverage=1 00:06:55.472 --rc genhtml_function_coverage=1 00:06:55.472 --rc genhtml_legend=1 00:06:55.472 --rc geninfo_all_blocks=1 00:06:55.472 --rc geninfo_unexecuted_blocks=1 00:06:55.472 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.472 ' 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:06:55.472 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.472 --rc genhtml_branch_coverage=1 00:06:55.472 --rc genhtml_function_coverage=1 00:06:55.472 --rc genhtml_legend=1 00:06:55.472 --rc geninfo_all_blocks=1 00:06:55.472 --rc geninfo_unexecuted_blocks=1 00:06:55.472 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.472 ' 00:06:55.472 12:48:58 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:55.472 12:48:58 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:55.472 12:48:58 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:55.472 12:48:58 
skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:55.472 12:48:58 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:55.733 ************************************ 00:06:55.733 START TEST skip_rpc 00:06:55.733 ************************************ 00:06:55.733 12:48:58 skip_rpc.skip_rpc -- common/autotest_common.sh@1129 -- # test_skip_rpc 00:06:55.733 12:48:58 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=135890 00:06:55.733 12:48:58 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:55.733 12:48:58 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:55.733 12:48:58 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:55.733 [2024-12-05 12:48:58.825443] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:06:55.733 [2024-12-05 12:48:58.825525] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid135890 ] 00:06:55.733 [2024-12-05 12:48:58.913247] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.733 [2024-12-05 12:48:58.935386] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # local es=0 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd spdk_get_version 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # rpc_cmd spdk_get_version 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # es=1 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 135890 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' -z 135890 ']' 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # kill -0 135890 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # uname 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 135890 00:07:01.015 
12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 135890' 00:07:01.015 killing process with pid 135890 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # kill 135890 00:07:01.015 12:49:03 skip_rpc.skip_rpc -- common/autotest_common.sh@978 -- # wait 135890 00:07:01.015 00:07:01.015 real 0m5.366s 00:07:01.015 user 0m5.120s 00:07:01.015 sys 0m0.301s 00:07:01.015 12:49:04 skip_rpc.skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:01.015 12:49:04 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:01.015 ************************************ 00:07:01.015 END TEST skip_rpc 00:07:01.015 ************************************ 00:07:01.015 12:49:04 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:07:01.015 12:49:04 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:01.015 12:49:04 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:01.015 12:49:04 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:01.015 ************************************ 00:07:01.015 START TEST skip_rpc_with_json 00:07:01.015 ************************************ 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_json 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=136978 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 136978 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # '[' -z 136978 ']' 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:01.015 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:01.015 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:01.015 [2024-12-05 12:49:04.277649] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
00:07:01.015 [2024-12-05 12:49:04.277734] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid136978 ] 00:07:01.275 [2024-12-05 12:49:04.364071] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.275 [2024-12-05 12:49:04.386241] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.275 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:01.275 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@868 -- # return 0 00:07:01.275 12:49:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:07:01.275 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:01.275 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:01.275 [2024-12-05 12:49:04.583546] nvmf_rpc.c:2707:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:07:01.275 request: 00:07:01.275 { 00:07:01.275 "trtype": "tcp", 00:07:01.275 "method": "nvmf_get_transports", 00:07:01.275 "req_id": 1 00:07:01.275 } 00:07:01.275 Got JSON-RPC error response 00:07:01.275 response: 00:07:01.275 { 00:07:01.275 "code": -19, 00:07:01.535 "message": "No such device" 00:07:01.535 } 00:07:01.535 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:01.535 12:49:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:07:01.535 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:01.535 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:01.535 [2024-12-05 12:49:04.595637] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:01.535 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:01.535 12:49:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:07:01.535 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:01.535 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:01.535 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:01.535 12:49:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:01.535 { 00:07:01.535 "subsystems": [ 00:07:01.535 { 00:07:01.535 "subsystem": "scheduler", 00:07:01.535 "config": [ 00:07:01.535 { 00:07:01.535 "method": "framework_set_scheduler", 00:07:01.535 "params": { 00:07:01.535 "name": "static" 00:07:01.535 } 00:07:01.535 } 00:07:01.535 ] 00:07:01.535 }, 00:07:01.535 { 00:07:01.535 "subsystem": "vmd", 00:07:01.535 "config": [] 00:07:01.535 }, 00:07:01.535 { 00:07:01.535 "subsystem": "sock", 00:07:01.535 "config": [ 00:07:01.535 { 00:07:01.535 "method": "sock_set_default_impl", 00:07:01.535 "params": { 00:07:01.535 "impl_name": "posix" 00:07:01.535 } 00:07:01.535 }, 00:07:01.535 { 00:07:01.535 "method": "sock_impl_set_options", 00:07:01.535 "params": { 00:07:01.535 "impl_name": "ssl", 00:07:01.535 "recv_buf_size": 4096, 00:07:01.535 "send_buf_size": 4096, 00:07:01.535 "enable_recv_pipe": true, 00:07:01.535 "enable_quickack": false, 00:07:01.535 "enable_placement_id": 
0,
00:07:01.535             "enable_zerocopy_send_server": true,
00:07:01.535             "enable_zerocopy_send_client": false,
00:07:01.535             "zerocopy_threshold": 0,
00:07:01.535             "tls_version": 0,
00:07:01.535             "enable_ktls": false
00:07:01.535           }
00:07:01.535         },
00:07:01.535         {
00:07:01.535           "method": "sock_impl_set_options",
00:07:01.535           "params": {
00:07:01.535             "impl_name": "posix",
00:07:01.535             "recv_buf_size": 2097152,
00:07:01.535             "send_buf_size": 2097152,
00:07:01.535             "enable_recv_pipe": true,
00:07:01.535             "enable_quickack": false,
00:07:01.535             "enable_placement_id": 0,
00:07:01.535             "enable_zerocopy_send_server": true,
00:07:01.535             "enable_zerocopy_send_client": false,
00:07:01.535             "zerocopy_threshold": 0,
00:07:01.535             "tls_version": 0,
00:07:01.535             "enable_ktls": false
00:07:01.535           }
00:07:01.535         }
00:07:01.535       ]
00:07:01.535     },
00:07:01.535     {
00:07:01.535       "subsystem": "iobuf",
00:07:01.535       "config": [
00:07:01.535         {
00:07:01.535           "method": "iobuf_set_options",
00:07:01.535           "params": {
00:07:01.535             "small_pool_count": 8192,
00:07:01.535             "large_pool_count": 1024,
00:07:01.535             "small_bufsize": 8192,
00:07:01.535             "large_bufsize": 135168,
00:07:01.535             "enable_numa": false
00:07:01.535           }
00:07:01.535         }
00:07:01.535       ]
00:07:01.535     },
00:07:01.535     {
00:07:01.535       "subsystem": "keyring",
00:07:01.535       "config": []
00:07:01.535     },
00:07:01.535     {
00:07:01.535       "subsystem": "vfio_user_target",
00:07:01.535       "config": null
00:07:01.535     },
00:07:01.535     {
00:07:01.535       "subsystem": "fsdev",
00:07:01.535       "config": [
00:07:01.535         {
00:07:01.535           "method": "fsdev_set_opts",
00:07:01.535           "params": {
00:07:01.535             "fsdev_io_pool_size": 65535,
00:07:01.535             "fsdev_io_cache_size": 256
00:07:01.535           }
00:07:01.535         }
00:07:01.535       ]
00:07:01.535     },
00:07:01.535     {
00:07:01.535       "subsystem": "accel",
00:07:01.535       "config": [
00:07:01.535         {
00:07:01.535           "method": "accel_set_options",
00:07:01.535           "params": {
00:07:01.535             "small_cache_size": 128,
00:07:01.535             "large_cache_size": 16,
00:07:01.535             "task_count": 2048,
00:07:01.535             "sequence_count": 2048,
00:07:01.535             "buf_count": 2048
00:07:01.535           }
00:07:01.535         }
00:07:01.535       ]
00:07:01.535     },
00:07:01.535     {
00:07:01.535       "subsystem": "bdev",
00:07:01.535       "config": [
00:07:01.535         {
00:07:01.535           "method": "bdev_set_options",
00:07:01.535           "params": {
00:07:01.535             "bdev_io_pool_size": 65535,
00:07:01.535             "bdev_io_cache_size": 256,
00:07:01.535             "bdev_auto_examine": true,
00:07:01.535             "iobuf_small_cache_size": 128,
00:07:01.535             "iobuf_large_cache_size": 16
00:07:01.535           }
00:07:01.535         },
00:07:01.535         {
00:07:01.535           "method": "bdev_raid_set_options",
00:07:01.535           "params": {
00:07:01.535             "process_window_size_kb": 1024,
00:07:01.535             "process_max_bandwidth_mb_sec": 0
00:07:01.535           }
00:07:01.535         },
00:07:01.535         {
00:07:01.535           "method": "bdev_nvme_set_options",
00:07:01.535           "params": {
00:07:01.535             "action_on_timeout": "none",
00:07:01.535             "timeout_us": 0,
00:07:01.535             "timeout_admin_us": 0,
00:07:01.535             "keep_alive_timeout_ms": 10000,
00:07:01.535             "arbitration_burst": 0,
00:07:01.535             "low_priority_weight": 0,
00:07:01.535             "medium_priority_weight": 0,
00:07:01.535             "high_priority_weight": 0,
00:07:01.535             "nvme_adminq_poll_period_us": 10000,
00:07:01.535             "nvme_ioq_poll_period_us": 0,
00:07:01.535             "io_queue_requests": 0,
00:07:01.535             "delay_cmd_submit": true,
00:07:01.535             "transport_retry_count": 4,
00:07:01.535             "bdev_retry_count": 3,
00:07:01.535             "transport_ack_timeout": 0,
00:07:01.535             "ctrlr_loss_timeout_sec": 0,
00:07:01.535             "reconnect_delay_sec": 0,
00:07:01.535             "fast_io_fail_timeout_sec": 0,
00:07:01.535             "disable_auto_failback": false,
00:07:01.535             "generate_uuids": false,
00:07:01.535             "transport_tos": 0,
00:07:01.535             "nvme_error_stat": false,
00:07:01.535             "rdma_srq_size": 0,
00:07:01.535             "io_path_stat": false,
00:07:01.535             "allow_accel_sequence": false,
00:07:01.535             "rdma_max_cq_size": 0,
00:07:01.535             "rdma_cm_event_timeout_ms": 0,
00:07:01.535             "dhchap_digests": [
00:07:01.535               "sha256",
00:07:01.535               "sha384",
00:07:01.535               "sha512"
00:07:01.535             ],
00:07:01.535             "dhchap_dhgroups": [
00:07:01.535               "null",
00:07:01.535               "ffdhe2048",
00:07:01.535               "ffdhe3072",
00:07:01.535               "ffdhe4096",
00:07:01.535               "ffdhe6144",
00:07:01.535               "ffdhe8192"
00:07:01.535             ]
00:07:01.535           }
00:07:01.535         },
00:07:01.535         {
00:07:01.535           "method": "bdev_nvme_set_hotplug",
00:07:01.535           "params": {
00:07:01.535             "period_us": 100000,
00:07:01.535             "enable": false
00:07:01.535           }
00:07:01.535         },
00:07:01.535         {
00:07:01.535           "method": "bdev_iscsi_set_options",
00:07:01.535           "params": {
00:07:01.535             "timeout_sec": 30
00:07:01.535           }
00:07:01.535         },
00:07:01.535         {
00:07:01.535           "method": "bdev_wait_for_examine"
00:07:01.535         }
00:07:01.535       ]
00:07:01.535     },
00:07:01.535     {
00:07:01.535       "subsystem": "nvmf",
00:07:01.535       "config": [
00:07:01.535         {
00:07:01.535           "method": "nvmf_set_config",
00:07:01.535           "params": {
00:07:01.535             "discovery_filter": "match_any",
00:07:01.535             "admin_cmd_passthru": {
00:07:01.535               "identify_ctrlr": false
00:07:01.535             },
00:07:01.535             "dhchap_digests": [
00:07:01.535               "sha256",
00:07:01.535               "sha384",
00:07:01.535               "sha512"
00:07:01.535             ],
00:07:01.535             "dhchap_dhgroups": [
00:07:01.535               "null",
00:07:01.535               "ffdhe2048",
00:07:01.535               "ffdhe3072",
00:07:01.535               "ffdhe4096",
00:07:01.535               "ffdhe6144",
00:07:01.535               "ffdhe8192"
00:07:01.536             ]
00:07:01.536           }
00:07:01.536         },
00:07:01.536         {
00:07:01.536           "method": "nvmf_set_max_subsystems",
00:07:01.536           "params": {
00:07:01.536             "max_subsystems": 1024
00:07:01.536           }
00:07:01.536         },
00:07:01.536         {
00:07:01.536           "method": "nvmf_set_crdt",
00:07:01.536           "params": {
00:07:01.536             "crdt1": 0,
00:07:01.536             "crdt2": 0,
00:07:01.536             "crdt3": 0
00:07:01.536           }
00:07:01.536         },
00:07:01.536         {
00:07:01.536           "method": "nvmf_create_transport",
00:07:01.536           "params": {
00:07:01.536             "trtype": "TCP",
00:07:01.536             "max_queue_depth": 128,
00:07:01.536             "max_io_qpairs_per_ctrlr": 127,
00:07:01.536             "in_capsule_data_size": 4096,
00:07:01.536             "max_io_size": 131072,
00:07:01.536             "io_unit_size": 131072,
00:07:01.536             "max_aq_depth": 128,
00:07:01.536             "num_shared_buffers": 511,
00:07:01.536             "buf_cache_size": 4294967295,
00:07:01.536             "dif_insert_or_strip": false,
00:07:01.536             "zcopy": false,
00:07:01.536             "c2h_success": true,
00:07:01.536             "sock_priority": 0,
00:07:01.536             "abort_timeout_sec": 1,
00:07:01.536             "ack_timeout": 0,
00:07:01.536             "data_wr_pool_size": 0
00:07:01.536           }
00:07:01.536         }
00:07:01.536       ]
00:07:01.536     },
00:07:01.536     {
00:07:01.536       "subsystem": "nbd",
00:07:01.536       "config": []
00:07:01.536     },
00:07:01.536     {
00:07:01.536       "subsystem": "ublk",
00:07:01.536       "config": []
00:07:01.536     },
00:07:01.536     {
00:07:01.536       "subsystem": "vhost_blk",
00:07:01.536       "config": []
00:07:01.536     },
00:07:01.536     {
00:07:01.536       "subsystem": "scsi",
00:07:01.536       "config": null
00:07:01.536     },
00:07:01.536     {
00:07:01.536       "subsystem": "iscsi",
00:07:01.536       "config": [
00:07:01.536         {
00:07:01.536           "method": "iscsi_set_options",
00:07:01.536           "params": {
00:07:01.536             "node_base": "iqn.2016-06.io.spdk",
00:07:01.536             "max_sessions": 128,
00:07:01.536             "max_connections_per_session": 2,
00:07:01.536             "max_queue_depth": 64,
00:07:01.536             "default_time2wait": 2,
00:07:01.536             "default_time2retain": 20,
00:07:01.536             "first_burst_length": 8192,
00:07:01.536             "immediate_data": true,
00:07:01.536             "allow_duplicated_isid": false,
00:07:01.536             "error_recovery_level": 0,
00:07:01.536             "nop_timeout": 60,
00:07:01.536             "nop_in_interval": 30,
00:07:01.536             "disable_chap": false,
00:07:01.536             "require_chap": false,
00:07:01.536             "mutual_chap": false,
00:07:01.536             "chap_group": 0,
00:07:01.536             "max_large_datain_per_connection": 64,
00:07:01.536             "max_r2t_per_connection": 4,
00:07:01.536             "pdu_pool_size": 36864,
00:07:01.536             "immediate_data_pool_size": 16384,
00:07:01.536             "data_out_pool_size": 2048
00:07:01.536           }
00:07:01.536         }
00:07:01.536       ]
00:07:01.536     },
00:07:01.536     {
00:07:01.536       "subsystem": "vhost_scsi",
00:07:01.536       "config": []
00:07:01.536     }
00:07:01.536   ]
00:07:01.536 }
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 136978
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 136978 ']'
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 136978
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 136978
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 136978'
killing process with pid 136978
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 136978
00:07:01.536 12:49:04 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 136978
00:07:02.114 12:49:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=137020
00:07:02.114 12:49:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5
00:07:02.114 12:49:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json
00:07:07.400 12:49:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 137020
00:07:07.400 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 137020 ']'
00:07:07.400 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 137020
00:07:07.400 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname
00:07:07.400 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:07:07.400 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 137020
00:07:07.400 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:07:07.400 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:07:07.400 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 137020'
00:07:07.400 killing process with pid 137020
00:07:07.400 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 137020
00:07:07.400 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 137020
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt
00:07:07.401
00:07:07.401 real 0m6.249s
00:07:07.401 user 0m5.905s
00:07:07.401 sys 0m0.679s
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x
00:07:07.401 ************************************
00:07:07.401 END TEST skip_rpc_with_json
00:07:07.401 ************************************
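
The JSON blob dumped above is the target's runtime configuration, presumably captured with rpc.py save_config earlier in the test, and rpc/skip_rpc.sh@46-@52 then replay it with the RPC server disabled. A condensed sketch of that flow, with /tmp paths substituted for the workspace paths shown in the log:

  # Capture the live configuration of a running spdk_tgt as JSON
  ./scripts/rpc.py save_config > /tmp/config.json
  # Relaunch with no RPC server: the target must initialize from the JSON alone
  ./build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /tmp/config.json > /tmp/log.txt 2>&1 &
  sleep 5
  # The test passes if the nvmf TCP transport was restored from the config
  grep -q 'TCP Transport Init' /tmp/log.txt
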
00:07:07.401 12:49:10 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay
00:07:07.401 12:49:10 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:07:07.401 12:49:10 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:07.401 12:49:10 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:07:07.401 ************************************
00:07:07.401 START TEST skip_rpc_with_delay
00:07:07.401 ************************************
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_delay
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # local es=0
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]]
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc
00:07:07.401 [2024-12-05 12:49:10.614826] app.c: 842:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started.
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # es=1
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@663 -- # (( es > 128 ))
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@674 -- # [[ -n '' ]]
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@679 -- # (( !es == 0 ))
00:07:07.401
00:07:07.401 real 0m0.046s
00:07:07.401 user 0m0.020s
00:07:07.401 sys 0m0.026s
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:07.401 12:49:10 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x
00:07:07.401 ************************************
00:07:07.401 END TEST skip_rpc_with_delay
00:07:07.401 ************************************
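
The delay test only asserts that the two flags are mutually exclusive: --wait-for-rpc parks spdk_tgt until an RPC resumes startup (normally framework_start_init), which can never arrive when --no-rpc-server is also given, so the app must refuse to start and the NOT wrapper turns that failure into a pass. Reproduced by hand, paths as in the log:

  # Expected to fail: nothing could ever deliver the resume RPC
  ./build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc
  # => app.c: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started.
  echo $?   # non-zero, which the NOT helper treats as success
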
00:07:07.401 12:49:10 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname
00:07:07.401 12:49:10 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']'
00:07:07.401 12:49:10 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init
00:07:07.401 12:49:10 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:07:07.401 12:49:10 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:07.401 12:49:10 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:07:07.661 ************************************
00:07:07.661 START TEST exit_on_failed_rpc_init
00:07:07.661 ************************************
00:07:07.661 12:49:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1129 -- # test_exit_on_failed_rpc_init
00:07:07.661 12:49:10 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=138106
00:07:07.661 12:49:10 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 138106
00:07:07.661 12:49:10 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1
00:07:07.661 12:49:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # '[' -z 138106 ']'
00:07:07.661 12:49:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:07:07.661 12:49:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # local max_retries=100
00:07:07.661 12:49:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:07:07.661 12:49:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@844 -- # xtrace_disable
00:07:07.661 12:49:10 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x
00:07:07.661 [2024-12-05 12:49:10.749095] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization...
00:07:07.661 [2024-12-05 12:49:10.749187] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138106 ]
00:07:07.661 [2024-12-05 12:49:10.836154] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:07.661 [2024-12-05 12:49:10.859937] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@868 -- # return 0
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # local es=0
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]]
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2
00:07:07.922 [2024-12-05 12:49:11.090728] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization...
00:07:07.922 [2024-12-05 12:49:11.090821] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138119 ]
00:07:07.922 [2024-12-05 12:49:11.177174] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:07.922 [2024-12-05 12:49:11.198742] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1
00:07:07.922 [2024-12-05 12:49:11.198812] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another.
00:07:07.922 [2024-12-05 12:49:11.198825] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock
00:07:07.922 [2024-12-05 12:49:11.198837] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # es=234
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # (( es > 128 ))
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@664 -- # es=106
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@665 -- # case "$es" in
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@672 -- # es=1
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@679 -- # (( !es == 0 ))
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 138106
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' -z 138106 ']'
00:07:07.922 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # kill -0 138106
00:07:08.182 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # uname
00:07:08.182 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:07:08.182 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 138106
00:07:08.182 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:07:08.182 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:07:08.182 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # echo 'killing process with pid 138106'
killing process with pid 138106
00:07:08.182 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # kill 138106
00:07:08.182 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@978 -- # wait 138106
00:07:08.442
00:07:08.442 real 0m0.858s
00:07:08.442 user 0m0.807s
00:07:08.442 sys 0m0.471s
00:07:08.442 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:08.442 12:49:11 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x
00:07:08.442 ************************************
00:07:08.442 END TEST exit_on_failed_rpc_init
00:07:08.442 ************************************
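
exit_on_failed_rpc_init leans on both targets defaulting to the same RPC socket: pid 138106 holds /var/tmp/spdk.sock, so the second instance fails rpc_initialize and exits non-zero. To genuinely run two targets side by side, the second needs its own socket via -r (spdk2.sock below is an assumed example name):

  ./build/bin/spdk_tgt -m 0x1 &                         # owns /var/tmp/spdk.sock
  ./build/bin/spdk_tgt -m 0x2                           # fails: socket path in use
  ./build/bin/spdk_tgt -m 0x2 -r /var/tmp/spdk2.sock    # works: separate socket
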
00:07:08.442 12:49:11 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json
00:07:08.442
00:07:08.442 real 0m13.053s
00:07:08.442 user 0m12.089s
00:07:08.442 sys 0m1.814s
00:07:08.442 12:49:11 skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:08.442 12:49:11 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:07:08.442 ************************************
00:07:08.442 END TEST skip_rpc
00:07:08.442 ************************************
00:07:08.442 12:49:11 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh
00:07:08.442 12:49:11 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:07:08.442 12:49:11 -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:08.442 12:49:11 -- common/autotest_common.sh@10 -- # set +x
00:07:08.442 ************************************
00:07:08.442 START TEST rpc_client
00:07:08.442 ************************************
00:07:08.442 12:49:11 rpc_client -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh
00:07:08.703 * Looking for test storage...
00:07:08.703 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client
00:07:08.703 12:49:11 rpc_client -- common/autotest_common.sh@1710 -- # [[ y == y ]]
00:07:08.703 12:49:11 rpc_client -- common/autotest_common.sh@1711 -- # lcov --version
00:07:08.703 12:49:11 rpc_client -- common/autotest_common.sh@1711 -- # awk '{print $NF}'
00:07:08.703 12:49:11 rpc_client -- common/autotest_common.sh@1711 -- # lt 1.15 2
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@336 -- # IFS=.-:
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@336 -- # read -ra ver1
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@337 -- # IFS=.-:
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@337 -- # read -ra ver2
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@338 -- # local 'op=<'
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@340 -- # ver1_l=2
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@341 -- # ver2_l=1
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@344 -- # case "$op" in
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@345 -- # : 1
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@364 -- # (( v = 0 ))
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@365 -- # decimal 1
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@353 -- # local d=1
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]]
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@355 -- # echo 1
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@366 -- # decimal 2
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@353 -- # local d=2
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]]
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@355 -- # echo 2
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] ))
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] ))
00:07:08.703 12:49:11 rpc_client -- scripts/common.sh@368 -- # return 0
00:07:08.703 12:49:11 rpc_client -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:07:08.703 12:49:11 rpc_client -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS=
00:07:08.703 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:08.703 --rc genhtml_branch_coverage=1
00:07:08.703 --rc genhtml_function_coverage=1
00:07:08.703 --rc genhtml_legend=1
00:07:08.703 --rc geninfo_all_blocks=1
00:07:08.703 --rc geninfo_unexecuted_blocks=1
00:07:08.703 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:08.703 '
00:07:08.703 12:49:11 rpc_client -- common/autotest_common.sh@1724 -- # LCOV_OPTS='
00:07:08.703 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:08.703 --rc genhtml_branch_coverage=1
00:07:08.703 --rc genhtml_function_coverage=1
00:07:08.703 --rc genhtml_legend=1
00:07:08.703 --rc geninfo_all_blocks=1
00:07:08.703 --rc geninfo_unexecuted_blocks=1
00:07:08.703 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:08.703 '
00:07:08.703 12:49:11 rpc_client -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov
00:07:08.703 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:08.703 --rc genhtml_branch_coverage=1
00:07:08.703 --rc genhtml_function_coverage=1
00:07:08.703 --rc genhtml_legend=1
00:07:08.703 --rc geninfo_all_blocks=1
00:07:08.703 --rc geninfo_unexecuted_blocks=1
00:07:08.703 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:08.703 '
00:07:08.703 12:49:11 rpc_client -- common/autotest_common.sh@1725 -- # LCOV='lcov
00:07:08.703 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:08.703 --rc genhtml_branch_coverage=1
00:07:08.703 --rc genhtml_function_coverage=1
00:07:08.703 --rc genhtml_legend=1
00:07:08.703 --rc geninfo_all_blocks=1
00:07:08.703 --rc geninfo_unexecuted_blocks=1
00:07:08.703 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:08.703 '
00:07:08.703 12:49:11 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test
00:07:08.703 OK
00:07:08.703 12:49:11 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT
00:07:08.703
00:07:08.703 real 0m0.224s
00:07:08.703 user 0m0.105s
00:07:08.703 sys 0m0.137s
00:07:08.703 12:49:11 rpc_client -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:08.703 12:49:11 rpc_client -- common/autotest_common.sh@10 -- # set +x
00:07:08.703 ************************************
00:07:08.703 END TEST rpc_client
00:07:08.703 ************************************
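
The lt 1.15 2 trace that precedes every suite comes from cmp_versions in scripts/common.sh: both version strings are split on ., - and :, then compared field by field after a decimal check. A condensed equivalent of the same idea (not the verbatim SPDK helper):

  version_lt() {               # version_lt 1.15 2 -> true (1.15 < 2)
      local IFS=.-: a b i
      read -ra a <<< "$1"
      read -ra b <<< "$2"
      local n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
      for (( i = 0; i < n; i++ )); do
          (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
          (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
      done
      return 1                 # equal is not less-than
  }
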
00:07:08.703 12:49:11 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh
00:07:08.703 12:49:11 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:07:08.703 12:49:11 -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:08.703 12:49:11 -- common/autotest_common.sh@10 -- # set +x
00:07:08.703 ************************************
00:07:08.703 START TEST json_config
00:07:08.703 ************************************
00:07:08.963 12:49:12 json_config -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh
00:07:08.963 12:49:12 json_config -- common/autotest_common.sh@1710 -- # [[ y == y ]]
00:07:08.963 12:49:12 json_config -- common/autotest_common.sh@1711 -- # lcov --version
00:07:08.963 12:49:12 json_config -- common/autotest_common.sh@1711 -- # awk '{print $NF}'
00:07:08.963 12:49:12 json_config -- common/autotest_common.sh@1711 -- # lt 1.15 2
00:07:08.963 12:49:12 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2
00:07:08.963 12:49:12 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l
00:07:08.963 12:49:12 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l
00:07:08.963 12:49:12 json_config -- scripts/common.sh@336 -- # IFS=.-:
00:07:08.963 12:49:12 json_config -- scripts/common.sh@336 -- # read -ra ver1
00:07:08.963 12:49:12 json_config -- scripts/common.sh@337 -- # IFS=.-:
00:07:08.963 12:49:12 json_config -- scripts/common.sh@337 -- # read -ra ver2
00:07:08.963 12:49:12 json_config -- scripts/common.sh@338 -- # local 'op=<'
00:07:08.963 12:49:12 json_config -- scripts/common.sh@340 -- # ver1_l=2
00:07:08.963 12:49:12 json_config -- scripts/common.sh@341 -- # ver2_l=1
00:07:08.963 12:49:12 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v
00:07:08.963 12:49:12 json_config -- scripts/common.sh@344 -- # case "$op" in
00:07:08.963 12:49:12 json_config -- scripts/common.sh@345 -- # : 1
00:07:08.963 12:49:12 json_config -- scripts/common.sh@364 -- # (( v = 0 ))
00:07:08.963 12:49:12 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:07:08.963 12:49:12 json_config -- scripts/common.sh@365 -- # decimal 1
00:07:08.963 12:49:12 json_config -- scripts/common.sh@353 -- # local d=1
00:07:08.963 12:49:12 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]]
00:07:08.963 12:49:12 json_config -- scripts/common.sh@355 -- # echo 1
00:07:08.963 12:49:12 json_config -- scripts/common.sh@365 -- # ver1[v]=1
00:07:08.963 12:49:12 json_config -- scripts/common.sh@366 -- # decimal 2
00:07:08.963 12:49:12 json_config -- scripts/common.sh@353 -- # local d=2
00:07:08.963 12:49:12 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]]
00:07:08.963 12:49:12 json_config -- scripts/common.sh@355 -- # echo 2
00:07:08.963 12:49:12 json_config -- scripts/common.sh@366 -- # ver2[v]=2
00:07:08.963 12:49:12 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] ))
00:07:08.963 12:49:12 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] ))
00:07:08.963 12:49:12 json_config -- scripts/common.sh@368 -- # return 0
00:07:08.963 12:49:12 json_config -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:07:08.963 12:49:12 json_config -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS=
00:07:08.963 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:08.963 --rc genhtml_branch_coverage=1
00:07:08.963 --rc genhtml_function_coverage=1
00:07:08.963 --rc genhtml_legend=1
00:07:08.963 --rc geninfo_all_blocks=1
00:07:08.963 --rc geninfo_unexecuted_blocks=1
00:07:08.963 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:08.963 '
00:07:08.963 12:49:12 json_config -- common/autotest_common.sh@1724 -- # LCOV_OPTS='
00:07:08.963 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:08.963 --rc genhtml_branch_coverage=1
00:07:08.963 --rc genhtml_function_coverage=1
00:07:08.963 --rc genhtml_legend=1
00:07:08.963 --rc geninfo_all_blocks=1
00:07:08.963 --rc geninfo_unexecuted_blocks=1
00:07:08.963 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:08.963 '
00:07:08.963 12:49:12 json_config -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov
00:07:08.963 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:08.963 --rc genhtml_branch_coverage=1
00:07:08.963 --rc genhtml_function_coverage=1
00:07:08.963 --rc genhtml_legend=1
00:07:08.963 --rc geninfo_all_blocks=1
00:07:08.963 --rc geninfo_unexecuted_blocks=1
00:07:08.963 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:08.963 '
00:07:08.963 12:49:12 json_config -- common/autotest_common.sh@1725 -- # LCOV='lcov
00:07:08.963 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:08.963 --rc genhtml_branch_coverage=1
00:07:08.963 --rc genhtml_function_coverage=1
00:07:08.963 --rc genhtml_legend=1
00:07:08.964 --rc geninfo_all_blocks=1
00:07:08.964 --rc geninfo_unexecuted_blocks=1
00:07:08.964 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:08.964 '
00:07:08.964 12:49:12 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@7 -- # uname -s
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:07:08.964 12:49:12 json_config -- scripts/common.sh@15 -- # shopt -s extglob
00:07:08.964 12:49:12 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]]
00:07:08.964 12:49:12 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:07:08.964 12:49:12 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:07:08.964 12:49:12 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:08.964 12:49:12 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:08.964 12:49:12 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:08.964 12:49:12 json_config -- paths/export.sh@5 -- # export PATH
00:07:08.964 12:49:12 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@51 -- # : 0
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']'
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']'
00:07:08.964 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']'
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']'
00:07:08.964 12:49:12 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0
00:07:08.964 12:49:12 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh
00:07:08.964 12:49:12 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]]
00:07:08.964 12:49:12 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]]
00:07:08.964 12:49:12 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]]
00:07:08.964 12:49:12 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 ))
00:07:08.964 12:49:12 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests'
WARNING: No tests are enabled so not running JSON configuration tests
00:07:08.964 12:49:12 json_config -- json_config/json_config.sh@28 -- # exit 0
00:07:08.964
00:07:08.964 real 0m0.202s
00:07:08.964 user 0m0.129s
00:07:08.964 sys 0m0.083s
00:07:08.964 12:49:12 json_config -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:08.964 12:49:12 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:08.964 ************************************
00:07:08.964 END TEST json_config
00:07:08.964 ************************************
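
json_config exiting with only a warning is expected in this job: the guard traced at json_config.sh@26-@28 sums the storage feature flags and bails out when none are enabled. Reconstructed from the trace above:

  if (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF \
        + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )); then
      echo 'WARNING: No tests are enabled so not running JSON configuration tests'
      exit 0
  fi
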
00:07:08.964 12:49:12 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh
00:07:08.964 12:49:12 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:07:08.964 12:49:12 -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:08.964 12:49:12 -- common/autotest_common.sh@10 -- # set +x
00:07:09.225 ************************************
00:07:09.225 START TEST json_config_extra_key
00:07:09.225 ************************************
00:07:09.225 12:49:12 json_config_extra_key -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh
00:07:09.225 12:49:12 json_config_extra_key -- common/autotest_common.sh@1710 -- # [[ y == y ]]
00:07:09.225 12:49:12 json_config_extra_key -- common/autotest_common.sh@1711 -- # lcov --version
00:07:09.225 12:49:12 json_config_extra_key -- common/autotest_common.sh@1711 -- # awk '{print $NF}'
00:07:09.225 12:49:12 json_config_extra_key -- common/autotest_common.sh@1711 -- # lt 1.15 2
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-:
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-:
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<'
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@345 -- # : 1
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 ))
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@353 -- # local d=1
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]]
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@355 -- # echo 1
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@353 -- # local d=2
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]]
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@355 -- # echo 2
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] ))
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] ))
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@368 -- # return 0
00:07:09.225 12:49:12 json_config_extra_key -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:07:09.225 12:49:12 json_config_extra_key -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS=
00:07:09.225 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:09.225 --rc genhtml_branch_coverage=1
00:07:09.225 --rc genhtml_function_coverage=1
00:07:09.225 --rc genhtml_legend=1
00:07:09.225 --rc geninfo_all_blocks=1
00:07:09.225 --rc geninfo_unexecuted_blocks=1
00:07:09.225 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:09.225 '
00:07:09.225 12:49:12 json_config_extra_key -- common/autotest_common.sh@1724 -- # LCOV_OPTS='
00:07:09.225 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:09.225 --rc genhtml_branch_coverage=1
00:07:09.225 --rc genhtml_function_coverage=1
00:07:09.225 --rc genhtml_legend=1
00:07:09.225 --rc geninfo_all_blocks=1
00:07:09.225 --rc geninfo_unexecuted_blocks=1
00:07:09.225 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:09.225 '
00:07:09.225 12:49:12 json_config_extra_key -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov
00:07:09.225 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:09.225 --rc genhtml_branch_coverage=1
00:07:09.225 --rc genhtml_function_coverage=1
00:07:09.225 --rc genhtml_legend=1
00:07:09.225 --rc geninfo_all_blocks=1
00:07:09.225 --rc geninfo_unexecuted_blocks=1
00:07:09.225 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:09.225 '
00:07:09.225 12:49:12 json_config_extra_key -- common/autotest_common.sh@1725 -- # LCOV='lcov
00:07:09.225 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:09.225 --rc genhtml_branch_coverage=1
00:07:09.225 --rc genhtml_function_coverage=1
00:07:09.225 --rc genhtml_legend=1
00:07:09.225 --rc geninfo_all_blocks=1
00:07:09.225 --rc geninfo_unexecuted_blocks=1
00:07:09.225 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:09.225 '
00:07:09.225 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]]
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:07:09.225 12:49:12 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:07:09.225 12:49:12 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:09.225 12:49:12 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:09.225 12:49:12 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:09.225 12:49:12 json_config_extra_key -- paths/export.sh@5 -- # export PATH
00:07:09.225 12:49:12 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@51 -- # : 0
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']'
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']'
00:07:09.225 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']'
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']'
00:07:09.225 12:49:12 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0
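
The recurring "[: : integer expression expected" complaint is harmless but avoidable: nvmf/common.sh line 33 hands an unset variable to a numeric [ ... -eq 1 ] test, so [ sees an empty string. The variable's name is not visible in this trace, so SOME_FLAG below is a placeholder for illustration only:

  [ "$SOME_FLAG" -eq 1 ]        # unset -> "[: : integer expression expected"
  [ "${SOME_FLAG:-0}" -eq 1 ]   # defaulted expansion always gives [ a number
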
00:07:09.225 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh
00:07:09.225 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='')
00:07:09.225 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid
00:07:09.225 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock')
00:07:09.225 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket
00:07:09.225 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024')
00:07:09.225 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params
00:07:09.225 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json')
00:07:09.225 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path
00:07:09.226 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR
00:07:09.226 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...'
INFO: launching applications...
00:07:09.226 12:49:12 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json
00:07:09.226 12:49:12 json_config_extra_key -- json_config/common.sh@9 -- # local app=target
00:07:09.226 12:49:12 json_config_extra_key -- json_config/common.sh@10 -- # shift
00:07:09.226 12:49:12 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]]
00:07:09.226 12:49:12 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]]
00:07:09.226 12:49:12 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params=
00:07:09.226 12:49:12 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]]
00:07:09.226 12:49:12 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]]
00:07:09.226 12:49:12 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=138554
00:07:09.226 12:49:12 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...'
Waiting for target to run...
00:07:09.226 12:49:12 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 138554 /var/tmp/spdk_tgt.sock
00:07:09.226 12:49:12 json_config_extra_key -- common/autotest_common.sh@835 -- # '[' -z 138554 ']'
00:07:09.226 12:49:12 json_config_extra_key -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk_tgt.sock
00:07:09.226 12:49:12 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json
00:07:09.226 12:49:12 json_config_extra_key -- common/autotest_common.sh@840 -- # local max_retries=100
00:07:09.226 12:49:12 json_config_extra_key -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...
00:07:09.226 12:49:12 json_config_extra_key -- common/autotest_common.sh@844 -- # xtrace_disable
00:07:09.226 12:49:12 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x
00:07:09.226 [2024-12-05 12:49:12.522400] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization...
00:07:09.226 [2024-12-05 12:49:12.522477] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138554 ]
00:07:09.796 [2024-12-05 12:49:12.839551] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:09.796 [2024-12-05 12:49:12.852427] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:07:10.056 12:49:13 json_config_extra_key -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:07:10.056 12:49:13 json_config_extra_key -- common/autotest_common.sh@868 -- # return 0
00:07:10.056 12:49:13 json_config_extra_key -- json_config/common.sh@26 -- # echo ''
00:07:10.056
00:07:10.056 12:49:13 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...'
INFO: shutting down applications...
00:07:10.056 12:49:13 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target
00:07:10.056 12:49:13 json_config_extra_key -- json_config/common.sh@31 -- # local app=target
00:07:10.056 12:49:13 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]]
00:07:10.056 12:49:13 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 138554 ]]
00:07:10.056 12:49:13 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 138554
00:07:10.056 12:49:13 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 ))
00:07:10.056 12:49:13 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 ))
00:07:10.056 12:49:13 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 138554
00:07:10.057 12:49:13 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5
00:07:10.638 12:49:13 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ ))
00:07:10.638 12:49:13 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 ))
00:07:10.638 12:49:13 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 138554
00:07:10.638 12:49:13 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]=
00:07:10.638 12:49:13 json_config_extra_key -- json_config/common.sh@43 -- # break
00:07:10.638 12:49:13 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]]
00:07:10.638 12:49:13 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done'
SPDK target shutdown done
00:07:10.638 12:49:13 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success
Success
00:07:10.638
00:07:10.638 real 0m1.580s
00:07:10.638 user 0m1.277s
00:07:10.638 sys 0m0.440s
00:07:10.638 12:49:13 json_config_extra_key -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:10.638 12:49:13 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x
00:07:10.638 ************************************
00:07:10.638 END TEST json_config_extra_key
00:07:10.638 ************************************
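
json_config/common.sh drives each app through the associative arrays declared above (pid, socket, params, config path), and the shutdown traced at common.sh@38-@45 is a plain SIGINT-then-poll loop. Condensed sketch of that pattern:

  declare -A app_pid=([target]='') app_socket=([target]='/var/tmp/spdk_tgt.sock')
  # json_config_test_start_app launches spdk_tgt with the per-app socket and
  # config, then records the pid in app_pid[target] once waitforlisten succeeds.
  kill -SIGINT "${app_pid[target]}"         # ask the target to shut down
  for (( i = 0; i < 30; i++ )); do          # poll up to ~15s in 0.5s steps
      kill -0 "${app_pid[target]}" 2> /dev/null || break
      sleep 0.5
  done
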
00:07:10.638 12:49:13 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh
00:07:10.638 12:49:13 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:07:10.638 12:49:13 -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:10.638 12:49:13 -- common/autotest_common.sh@10 -- # set +x
00:07:10.898 ************************************
00:07:10.898 START TEST alias_rpc
00:07:10.898 ************************************
00:07:10.898 12:49:13 alias_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh
00:07:10.898 * Looking for test storage...
00:07:10.898 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@1710 -- # [[ y == y ]]
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@1711 -- # lcov --version
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@1711 -- # awk '{print $NF}'
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@1711 -- # lt 1.15 2
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@336 -- # IFS=.-:
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@337 -- # IFS=.-:
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@338 -- # local 'op=<'
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@344 -- # case "$op" in
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@345 -- # : 1
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 ))
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@365 -- # decimal 1
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@353 -- # local d=1
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]]
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@355 -- # echo 1
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@366 -- # decimal 2
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@353 -- # local d=2
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]]
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@355 -- # echo 2
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] ))
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] ))
00:07:10.898 12:49:14 alias_rpc -- scripts/common.sh@368 -- # return 0
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS=
00:07:10.898 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:10.898 --rc genhtml_branch_coverage=1
00:07:10.898 --rc genhtml_function_coverage=1
00:07:10.898 --rc genhtml_legend=1
00:07:10.898 --rc geninfo_all_blocks=1
00:07:10.898 --rc geninfo_unexecuted_blocks=1
00:07:10.898 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:10.898 '
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@1724 -- # LCOV_OPTS='
00:07:10.898 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:10.898 --rc genhtml_branch_coverage=1
00:07:10.898 --rc genhtml_function_coverage=1
00:07:10.898 --rc genhtml_legend=1
00:07:10.898 --rc geninfo_all_blocks=1
00:07:10.898 --rc geninfo_unexecuted_blocks=1
00:07:10.898 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:10.898 '
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov
00:07:10.898 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:10.898 --rc genhtml_branch_coverage=1
00:07:10.898 --rc genhtml_function_coverage=1
00:07:10.898 --rc genhtml_legend=1
00:07:10.898 --rc geninfo_all_blocks=1
00:07:10.898 --rc geninfo_unexecuted_blocks=1
00:07:10.898 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:10.898 '
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@1725 -- # LCOV='lcov
00:07:10.898 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:10.898 --rc genhtml_branch_coverage=1
00:07:10.898 --rc genhtml_function_coverage=1
00:07:10.898 --rc genhtml_legend=1
00:07:10.898 --rc geninfo_all_blocks=1
00:07:10.898 --rc geninfo_unexecuted_blocks=1
00:07:10.898 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:10.898 '
00:07:10.898 12:49:14 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR
00:07:10.898 12:49:14 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:07:10.898 12:49:14 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=138877
00:07:10.898 12:49:14 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 138877
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@835 -- # '[' -z 138877 ']'
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@840 -- # local max_retries=100
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@844 -- # xtrace_disable
00:07:10.898 12:49:14 alias_rpc -- common/autotest_common.sh@10 -- # set +x
00:07:10.898 [2024-12-05 12:49:14.176862] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization...
00:07:10.898 [2024-12-05 12:49:14.176926] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid138877 ]
00:07:11.158 [2024-12-05 12:49:14.256612] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:11.158 [2024-12-05 12:49:14.279617] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:07:11.417 12:49:14 alias_rpc -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:07:11.417 12:49:14 alias_rpc -- common/autotest_common.sh@868 -- # return 0
00:07:11.417 12:49:14 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i
00:07:11.417 12:49:14 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 138877
00:07:11.417 12:49:14 alias_rpc -- common/autotest_common.sh@954 -- # '[' -z 138877 ']'
00:07:11.417 12:49:14 alias_rpc -- common/autotest_common.sh@958 -- # kill -0 138877
00:07:11.417 12:49:14 alias_rpc -- common/autotest_common.sh@959 -- # uname
00:07:11.417 12:49:14 alias_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:07:11.676 12:49:14 alias_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 138877
00:07:11.676 12:49:14 alias_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:07:11.676 12:49:14 alias_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:07:11.676 12:49:14 alias_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 138877'
killing process with pid 138877
00:07:11.676 12:49:14 alias_rpc -- common/autotest_common.sh@973 -- # kill 138877
00:07:11.676 12:49:14 alias_rpc -- common/autotest_common.sh@978 -- # wait 138877
00:07:11.940
00:07:11.940 real 0m1.100s
00:07:11.940 user 0m1.073s
00:07:11.940 sys 0m0.471s
00:07:11.940 12:49:15 alias_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:11.940 12:49:15 alias_rpc -- common/autotest_common.sh@10 -- # set +x
00:07:11.940 ************************************
00:07:11.940 END TEST alias_rpc
00:07:11.940 ************************************
00:07:11.940 ************************************ 00:07:11.940 12:49:15 spdkcli_tcp -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:11.940 * Looking for test storage... 00:07:11.940 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:07:11.940 12:49:15 spdkcli_tcp -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:07:11.940 12:49:15 spdkcli_tcp -- common/autotest_common.sh@1711 -- # lcov --version 00:07:11.940 12:49:15 spdkcli_tcp -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:07:12.202 12:49:15 spdkcli_tcp -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:12.202 12:49:15 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:07:12.202 12:49:15 spdkcli_tcp -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:12.202 12:49:15 spdkcli_tcp -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:07:12.202 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.202 --rc genhtml_branch_coverage=1 00:07:12.202 --rc genhtml_function_coverage=1 00:07:12.202 --rc genhtml_legend=1 00:07:12.203 --rc geninfo_all_blocks=1 00:07:12.203 --rc geninfo_unexecuted_blocks=1 00:07:12.203 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.203 ' 00:07:12.203 12:49:15 spdkcli_tcp -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:07:12.203 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.203 --rc genhtml_branch_coverage=1 00:07:12.203 --rc genhtml_function_coverage=1 00:07:12.203 --rc genhtml_legend=1 00:07:12.203 --rc geninfo_all_blocks=1 00:07:12.203 --rc geninfo_unexecuted_blocks=1 00:07:12.203 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.203 ' 00:07:12.203 12:49:15 spdkcli_tcp -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:07:12.203 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.203 --rc genhtml_branch_coverage=1 00:07:12.203 --rc genhtml_function_coverage=1 00:07:12.203 --rc genhtml_legend=1 00:07:12.203 --rc geninfo_all_blocks=1 00:07:12.203 --rc geninfo_unexecuted_blocks=1 00:07:12.203 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.203 ' 00:07:12.203 12:49:15 spdkcli_tcp -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:07:12.203 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:12.203 --rc genhtml_branch_coverage=1 00:07:12.203 --rc genhtml_function_coverage=1 00:07:12.203 --rc genhtml_legend=1 00:07:12.203 --rc geninfo_all_blocks=1 00:07:12.203 --rc geninfo_unexecuted_blocks=1 00:07:12.203 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:12.203 ' 00:07:12.203 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:07:12.203 12:49:15 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:12.203 12:49:15 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:07:12.203 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:12.203 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:12.203 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:12.203 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:12.203 12:49:15 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:07:12.203 12:49:15 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:12.203 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=139198 00:07:12.203 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 139198 00:07:12.203 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:12.203 12:49:15 spdkcli_tcp -- common/autotest_common.sh@835 -- # '[' -z 139198 ']' 00:07:12.203 12:49:15 spdkcli_tcp -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:12.203 12:49:15 spdkcli_tcp -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:12.203 12:49:15 spdkcli_tcp -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:12.203 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:12.203 12:49:15 spdkcli_tcp -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:12.203 12:49:15 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:12.203 [2024-12-05 12:49:15.369353] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:12.203 [2024-12-05 12:49:15.369437] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139198 ] 00:07:12.203 [2024-12-05 12:49:15.455770] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:12.203 [2024-12-05 12:49:15.479503] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.203 [2024-12-05 12:49:15.479503] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:12.462 12:49:15 spdkcli_tcp -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:12.462 12:49:15 spdkcli_tcp -- common/autotest_common.sh@868 -- # return 0 00:07:12.462 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=139209 00:07:12.462 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:12.462 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:12.723 [ 00:07:12.723 "spdk_get_version", 00:07:12.723 "rpc_get_methods", 00:07:12.723 "notify_get_notifications", 00:07:12.723 "notify_get_types", 00:07:12.723 "trace_get_info", 00:07:12.723 "trace_get_tpoint_group_mask", 00:07:12.723 "trace_disable_tpoint_group", 00:07:12.723 "trace_enable_tpoint_group", 00:07:12.723 "trace_clear_tpoint_mask", 00:07:12.723 "trace_set_tpoint_mask", 00:07:12.723 "fsdev_set_opts", 00:07:12.723 "fsdev_get_opts", 00:07:12.723 "framework_get_pci_devices", 00:07:12.723 "framework_get_config", 00:07:12.723 "framework_get_subsystems", 00:07:12.723 "vfu_tgt_set_base_path", 00:07:12.723 "keyring_get_keys", 
00:07:12.723 "iobuf_get_stats", 00:07:12.723 "iobuf_set_options", 00:07:12.723 "sock_get_default_impl", 00:07:12.723 "sock_set_default_impl", 00:07:12.723 "sock_impl_set_options", 00:07:12.723 "sock_impl_get_options", 00:07:12.723 "vmd_rescan", 00:07:12.723 "vmd_remove_device", 00:07:12.723 "vmd_enable", 00:07:12.723 "accel_get_stats", 00:07:12.723 "accel_set_options", 00:07:12.723 "accel_set_driver", 00:07:12.723 "accel_crypto_key_destroy", 00:07:12.723 "accel_crypto_keys_get", 00:07:12.723 "accel_crypto_key_create", 00:07:12.723 "accel_assign_opc", 00:07:12.723 "accel_get_module_info", 00:07:12.723 "accel_get_opc_assignments", 00:07:12.723 "bdev_get_histogram", 00:07:12.723 "bdev_enable_histogram", 00:07:12.723 "bdev_set_qos_limit", 00:07:12.723 "bdev_set_qd_sampling_period", 00:07:12.723 "bdev_get_bdevs", 00:07:12.723 "bdev_reset_iostat", 00:07:12.723 "bdev_get_iostat", 00:07:12.723 "bdev_examine", 00:07:12.723 "bdev_wait_for_examine", 00:07:12.723 "bdev_set_options", 00:07:12.723 "scsi_get_devices", 00:07:12.723 "thread_set_cpumask", 00:07:12.723 "scheduler_set_options", 00:07:12.723 "framework_get_governor", 00:07:12.723 "framework_get_scheduler", 00:07:12.723 "framework_set_scheduler", 00:07:12.723 "framework_get_reactors", 00:07:12.723 "thread_get_io_channels", 00:07:12.723 "thread_get_pollers", 00:07:12.723 "thread_get_stats", 00:07:12.723 "framework_monitor_context_switch", 00:07:12.723 "spdk_kill_instance", 00:07:12.723 "log_enable_timestamps", 00:07:12.723 "log_get_flags", 00:07:12.723 "log_clear_flag", 00:07:12.723 "log_set_flag", 00:07:12.723 "log_get_level", 00:07:12.723 "log_set_level", 00:07:12.723 "log_get_print_level", 00:07:12.723 "log_set_print_level", 00:07:12.723 "framework_enable_cpumask_locks", 00:07:12.723 "framework_disable_cpumask_locks", 00:07:12.723 "framework_wait_init", 00:07:12.723 "framework_start_init", 00:07:12.723 "virtio_blk_create_transport", 00:07:12.723 "virtio_blk_get_transports", 00:07:12.723 "vhost_controller_set_coalescing", 00:07:12.723 "vhost_get_controllers", 00:07:12.723 "vhost_delete_controller", 00:07:12.723 "vhost_create_blk_controller", 00:07:12.723 "vhost_scsi_controller_remove_target", 00:07:12.723 "vhost_scsi_controller_add_target", 00:07:12.723 "vhost_start_scsi_controller", 00:07:12.723 "vhost_create_scsi_controller", 00:07:12.723 "ublk_recover_disk", 00:07:12.723 "ublk_get_disks", 00:07:12.723 "ublk_stop_disk", 00:07:12.723 "ublk_start_disk", 00:07:12.723 "ublk_destroy_target", 00:07:12.723 "ublk_create_target", 00:07:12.723 "nbd_get_disks", 00:07:12.723 "nbd_stop_disk", 00:07:12.723 "nbd_start_disk", 00:07:12.723 "env_dpdk_get_mem_stats", 00:07:12.723 "nvmf_stop_mdns_prr", 00:07:12.723 "nvmf_publish_mdns_prr", 00:07:12.723 "nvmf_subsystem_get_listeners", 00:07:12.723 "nvmf_subsystem_get_qpairs", 00:07:12.723 "nvmf_subsystem_get_controllers", 00:07:12.723 "nvmf_get_stats", 00:07:12.723 "nvmf_get_transports", 00:07:12.723 "nvmf_create_transport", 00:07:12.723 "nvmf_get_targets", 00:07:12.723 "nvmf_delete_target", 00:07:12.723 "nvmf_create_target", 00:07:12.723 "nvmf_subsystem_allow_any_host", 00:07:12.723 "nvmf_subsystem_set_keys", 00:07:12.723 "nvmf_subsystem_remove_host", 00:07:12.723 "nvmf_subsystem_add_host", 00:07:12.723 "nvmf_ns_remove_host", 00:07:12.723 "nvmf_ns_add_host", 00:07:12.723 "nvmf_subsystem_remove_ns", 00:07:12.723 "nvmf_subsystem_set_ns_ana_group", 00:07:12.723 "nvmf_subsystem_add_ns", 00:07:12.723 "nvmf_subsystem_listener_set_ana_state", 00:07:12.723 "nvmf_discovery_get_referrals", 00:07:12.723 
"nvmf_discovery_remove_referral", 00:07:12.723 "nvmf_discovery_add_referral", 00:07:12.723 "nvmf_subsystem_remove_listener", 00:07:12.723 "nvmf_subsystem_add_listener", 00:07:12.723 "nvmf_delete_subsystem", 00:07:12.723 "nvmf_create_subsystem", 00:07:12.723 "nvmf_get_subsystems", 00:07:12.723 "nvmf_set_crdt", 00:07:12.723 "nvmf_set_config", 00:07:12.723 "nvmf_set_max_subsystems", 00:07:12.723 "iscsi_get_histogram", 00:07:12.723 "iscsi_enable_histogram", 00:07:12.723 "iscsi_set_options", 00:07:12.723 "iscsi_get_auth_groups", 00:07:12.723 "iscsi_auth_group_remove_secret", 00:07:12.723 "iscsi_auth_group_add_secret", 00:07:12.723 "iscsi_delete_auth_group", 00:07:12.723 "iscsi_create_auth_group", 00:07:12.723 "iscsi_set_discovery_auth", 00:07:12.723 "iscsi_get_options", 00:07:12.723 "iscsi_target_node_request_logout", 00:07:12.723 "iscsi_target_node_set_redirect", 00:07:12.723 "iscsi_target_node_set_auth", 00:07:12.723 "iscsi_target_node_add_lun", 00:07:12.723 "iscsi_get_stats", 00:07:12.723 "iscsi_get_connections", 00:07:12.723 "iscsi_portal_group_set_auth", 00:07:12.723 "iscsi_start_portal_group", 00:07:12.723 "iscsi_delete_portal_group", 00:07:12.723 "iscsi_create_portal_group", 00:07:12.723 "iscsi_get_portal_groups", 00:07:12.723 "iscsi_delete_target_node", 00:07:12.723 "iscsi_target_node_remove_pg_ig_maps", 00:07:12.723 "iscsi_target_node_add_pg_ig_maps", 00:07:12.723 "iscsi_create_target_node", 00:07:12.723 "iscsi_get_target_nodes", 00:07:12.723 "iscsi_delete_initiator_group", 00:07:12.723 "iscsi_initiator_group_remove_initiators", 00:07:12.723 "iscsi_initiator_group_add_initiators", 00:07:12.723 "iscsi_create_initiator_group", 00:07:12.723 "iscsi_get_initiator_groups", 00:07:12.723 "fsdev_aio_delete", 00:07:12.723 "fsdev_aio_create", 00:07:12.723 "keyring_linux_set_options", 00:07:12.723 "keyring_file_remove_key", 00:07:12.723 "keyring_file_add_key", 00:07:12.723 "vfu_virtio_create_fs_endpoint", 00:07:12.723 "vfu_virtio_create_scsi_endpoint", 00:07:12.723 "vfu_virtio_scsi_remove_target", 00:07:12.723 "vfu_virtio_scsi_add_target", 00:07:12.723 "vfu_virtio_create_blk_endpoint", 00:07:12.723 "vfu_virtio_delete_endpoint", 00:07:12.723 "iaa_scan_accel_module", 00:07:12.723 "dsa_scan_accel_module", 00:07:12.723 "ioat_scan_accel_module", 00:07:12.723 "accel_error_inject_error", 00:07:12.723 "bdev_iscsi_delete", 00:07:12.723 "bdev_iscsi_create", 00:07:12.723 "bdev_iscsi_set_options", 00:07:12.723 "bdev_virtio_attach_controller", 00:07:12.723 "bdev_virtio_scsi_get_devices", 00:07:12.723 "bdev_virtio_detach_controller", 00:07:12.723 "bdev_virtio_blk_set_hotplug", 00:07:12.723 "bdev_ftl_set_property", 00:07:12.723 "bdev_ftl_get_properties", 00:07:12.723 "bdev_ftl_get_stats", 00:07:12.723 "bdev_ftl_unmap", 00:07:12.723 "bdev_ftl_unload", 00:07:12.723 "bdev_ftl_delete", 00:07:12.723 "bdev_ftl_load", 00:07:12.723 "bdev_ftl_create", 00:07:12.723 "bdev_aio_delete", 00:07:12.723 "bdev_aio_rescan", 00:07:12.723 "bdev_aio_create", 00:07:12.723 "blobfs_create", 00:07:12.723 "blobfs_detect", 00:07:12.723 "blobfs_set_cache_size", 00:07:12.723 "bdev_zone_block_delete", 00:07:12.723 "bdev_zone_block_create", 00:07:12.723 "bdev_delay_delete", 00:07:12.723 "bdev_delay_create", 00:07:12.723 "bdev_delay_update_latency", 00:07:12.723 "bdev_split_delete", 00:07:12.723 "bdev_split_create", 00:07:12.723 "bdev_error_inject_error", 00:07:12.723 "bdev_error_delete", 00:07:12.723 "bdev_error_create", 00:07:12.723 "bdev_raid_set_options", 00:07:12.723 "bdev_raid_remove_base_bdev", 00:07:12.723 "bdev_raid_add_base_bdev", 
00:07:12.723 "bdev_raid_delete", 00:07:12.723 "bdev_raid_create", 00:07:12.723 "bdev_raid_get_bdevs", 00:07:12.723 "bdev_lvol_set_parent_bdev", 00:07:12.723 "bdev_lvol_set_parent", 00:07:12.723 "bdev_lvol_check_shallow_copy", 00:07:12.723 "bdev_lvol_start_shallow_copy", 00:07:12.723 "bdev_lvol_grow_lvstore", 00:07:12.723 "bdev_lvol_get_lvols", 00:07:12.723 "bdev_lvol_get_lvstores", 00:07:12.723 "bdev_lvol_delete", 00:07:12.723 "bdev_lvol_set_read_only", 00:07:12.723 "bdev_lvol_resize", 00:07:12.723 "bdev_lvol_decouple_parent", 00:07:12.723 "bdev_lvol_inflate", 00:07:12.723 "bdev_lvol_rename", 00:07:12.724 "bdev_lvol_clone_bdev", 00:07:12.724 "bdev_lvol_clone", 00:07:12.724 "bdev_lvol_snapshot", 00:07:12.724 "bdev_lvol_create", 00:07:12.724 "bdev_lvol_delete_lvstore", 00:07:12.724 "bdev_lvol_rename_lvstore", 00:07:12.724 "bdev_lvol_create_lvstore", 00:07:12.724 "bdev_passthru_delete", 00:07:12.724 "bdev_passthru_create", 00:07:12.724 "bdev_nvme_cuse_unregister", 00:07:12.724 "bdev_nvme_cuse_register", 00:07:12.724 "bdev_opal_new_user", 00:07:12.724 "bdev_opal_set_lock_state", 00:07:12.724 "bdev_opal_delete", 00:07:12.724 "bdev_opal_get_info", 00:07:12.724 "bdev_opal_create", 00:07:12.724 "bdev_nvme_opal_revert", 00:07:12.724 "bdev_nvme_opal_init", 00:07:12.724 "bdev_nvme_send_cmd", 00:07:12.724 "bdev_nvme_set_keys", 00:07:12.724 "bdev_nvme_get_path_iostat", 00:07:12.724 "bdev_nvme_get_mdns_discovery_info", 00:07:12.724 "bdev_nvme_stop_mdns_discovery", 00:07:12.724 "bdev_nvme_start_mdns_discovery", 00:07:12.724 "bdev_nvme_set_multipath_policy", 00:07:12.724 "bdev_nvme_set_preferred_path", 00:07:12.724 "bdev_nvme_get_io_paths", 00:07:12.724 "bdev_nvme_remove_error_injection", 00:07:12.724 "bdev_nvme_add_error_injection", 00:07:12.724 "bdev_nvme_get_discovery_info", 00:07:12.724 "bdev_nvme_stop_discovery", 00:07:12.724 "bdev_nvme_start_discovery", 00:07:12.724 "bdev_nvme_get_controller_health_info", 00:07:12.724 "bdev_nvme_disable_controller", 00:07:12.724 "bdev_nvme_enable_controller", 00:07:12.724 "bdev_nvme_reset_controller", 00:07:12.724 "bdev_nvme_get_transport_statistics", 00:07:12.724 "bdev_nvme_apply_firmware", 00:07:12.724 "bdev_nvme_detach_controller", 00:07:12.724 "bdev_nvme_get_controllers", 00:07:12.724 "bdev_nvme_attach_controller", 00:07:12.724 "bdev_nvme_set_hotplug", 00:07:12.724 "bdev_nvme_set_options", 00:07:12.724 "bdev_null_resize", 00:07:12.724 "bdev_null_delete", 00:07:12.724 "bdev_null_create", 00:07:12.724 "bdev_malloc_delete", 00:07:12.724 "bdev_malloc_create" 00:07:12.724 ] 00:07:12.724 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:12.724 12:49:15 spdkcli_tcp -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:12.724 12:49:15 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:12.724 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:12.724 12:49:15 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 139198 00:07:12.724 12:49:15 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' -z 139198 ']' 00:07:12.724 12:49:15 spdkcli_tcp -- common/autotest_common.sh@958 -- # kill -0 139198 00:07:12.724 12:49:15 spdkcli_tcp -- common/autotest_common.sh@959 -- # uname 00:07:12.724 12:49:15 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:12.724 12:49:15 spdkcli_tcp -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 139198 00:07:12.724 12:49:15 spdkcli_tcp -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:12.724 12:49:15 spdkcli_tcp -- 
common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:12.724 12:49:15 spdkcli_tcp -- common/autotest_common.sh@972 -- # echo 'killing process with pid 139198' 00:07:12.724 killing process with pid 139198 00:07:12.724 12:49:15 spdkcli_tcp -- common/autotest_common.sh@973 -- # kill 139198 00:07:12.724 12:49:15 spdkcli_tcp -- common/autotest_common.sh@978 -- # wait 139198 00:07:12.984 00:07:12.984 real 0m1.122s 00:07:12.984 user 0m1.863s 00:07:12.984 sys 0m0.525s 00:07:12.984 12:49:16 spdkcli_tcp -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:12.984 12:49:16 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:12.984 ************************************ 00:07:12.984 END TEST spdkcli_tcp 00:07:12.984 ************************************ 00:07:13.245 12:49:16 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:13.245 12:49:16 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:13.245 12:49:16 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:13.245 12:49:16 -- common/autotest_common.sh@10 -- # set +x 00:07:13.245 ************************************ 00:07:13.245 START TEST dpdk_mem_utility 00:07:13.245 ************************************ 00:07:13.245 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:13.245 * Looking for test storage... 00:07:13.245 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:07:13.245 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:07:13.245 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@1711 -- # lcov --version 00:07:13.245 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:07:13.245 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:13.245 12:49:16 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:07:13.245 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:13.245 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:07:13.245 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:13.245 --rc genhtml_branch_coverage=1 00:07:13.245 --rc genhtml_function_coverage=1 00:07:13.245 --rc genhtml_legend=1 00:07:13.245 --rc geninfo_all_blocks=1 00:07:13.245 --rc geninfo_unexecuted_blocks=1 00:07:13.245 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:13.245 ' 00:07:13.245 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:07:13.245 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:13.245 --rc genhtml_branch_coverage=1 00:07:13.245 --rc genhtml_function_coverage=1 00:07:13.245 --rc genhtml_legend=1 00:07:13.245 --rc geninfo_all_blocks=1 00:07:13.245 --rc geninfo_unexecuted_blocks=1 00:07:13.245 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:13.245 ' 00:07:13.245 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:07:13.245 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:13.245 --rc genhtml_branch_coverage=1 00:07:13.245 --rc genhtml_function_coverage=1 00:07:13.245 --rc genhtml_legend=1 00:07:13.245 --rc geninfo_all_blocks=1 00:07:13.245 --rc geninfo_unexecuted_blocks=1 00:07:13.245 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:13.245 ' 00:07:13.245 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:07:13.245 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:13.245 --rc genhtml_branch_coverage=1 00:07:13.245 --rc genhtml_function_coverage=1 00:07:13.245 --rc genhtml_legend=1 00:07:13.245 --rc geninfo_all_blocks=1 00:07:13.245 --rc geninfo_unexecuted_blocks=1 00:07:13.245 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:13.245 ' 00:07:13.246 12:49:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:13.246 12:49:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=139541 00:07:13.246 12:49:16 dpdk_mem_utility -- 
dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 139541 00:07:13.246 12:49:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:13.246 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@835 -- # '[' -z 139541 ']' 00:07:13.246 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:13.246 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:13.246 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:13.246 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:13.246 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:13.246 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:13.506 [2024-12-05 12:49:16.560563] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:13.506 [2024-12-05 12:49:16.560628] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139541 ] 00:07:13.506 [2024-12-05 12:49:16.643692] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.506 [2024-12-05 12:49:16.666196] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.768 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:13.768 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@868 -- # return 0 00:07:13.768 12:49:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:13.768 12:49:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:13.768 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:13.768 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:13.768 { 00:07:13.768 "filename": "/tmp/spdk_mem_dump.txt" 00:07:13.768 } 00:07:13.768 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:13.768 12:49:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:13.768 DPDK memory size 818.000000 MiB in 1 heap(s) 00:07:13.768 1 heaps totaling size 818.000000 MiB 00:07:13.768 size: 818.000000 MiB heap id: 0 00:07:13.768 end heaps---------- 00:07:13.768 9 mempools totaling size 603.782043 MiB 00:07:13.768 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:13.768 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:13.768 size: 100.555481 MiB name: bdev_io_139541 00:07:13.768 size: 50.003479 MiB name: msgpool_139541 00:07:13.768 size: 36.509338 MiB name: fsdev_io_139541 00:07:13.768 size: 21.763794 MiB name: PDU_Pool 00:07:13.768 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:13.768 size: 4.133484 MiB name: evtpool_139541 00:07:13.768 size: 0.026123 MiB name: Session_Pool 00:07:13.768 end mempools------- 00:07:13.768 6 memzones totaling size 4.142822 MiB 00:07:13.768 size: 1.000366 MiB name: RG_ring_0_139541 00:07:13.768 size: 1.000366 MiB name: RG_ring_1_139541 00:07:13.768 size: 1.000366 MiB name: RG_ring_4_139541 
00:07:13.768 size: 1.000366 MiB name: RG_ring_5_139541 00:07:13.768 size: 0.125366 MiB name: RG_ring_2_139541 00:07:13.768 size: 0.015991 MiB name: RG_ring_3_139541 00:07:13.768 end memzones------- 00:07:13.768 12:49:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:13.768 heap id: 0 total size: 818.000000 MiB number of busy elements: 44 number of free elements: 15 00:07:13.768 list of free elements. size: 10.852478 MiB 00:07:13.768 element at address: 0x200019200000 with size: 0.999878 MiB 00:07:13.768 element at address: 0x200019400000 with size: 0.999878 MiB 00:07:13.768 element at address: 0x200000400000 with size: 0.998535 MiB 00:07:13.768 element at address: 0x200032000000 with size: 0.994446 MiB 00:07:13.768 element at address: 0x200008000000 with size: 0.959839 MiB 00:07:13.768 element at address: 0x200012c00000 with size: 0.944275 MiB 00:07:13.768 element at address: 0x200019600000 with size: 0.936584 MiB 00:07:13.768 element at address: 0x200000200000 with size: 0.717346 MiB 00:07:13.768 element at address: 0x20001ae00000 with size: 0.582886 MiB 00:07:13.768 element at address: 0x200000c00000 with size: 0.495422 MiB 00:07:13.768 element at address: 0x200003e00000 with size: 0.490723 MiB 00:07:13.768 element at address: 0x200019800000 with size: 0.485657 MiB 00:07:13.768 element at address: 0x200010600000 with size: 0.481934 MiB 00:07:13.768 element at address: 0x200028200000 with size: 0.410034 MiB 00:07:13.768 element at address: 0x200000800000 with size: 0.355042 MiB 00:07:13.768 list of standard malloc elements. size: 199.218628 MiB 00:07:13.768 element at address: 0x2000081fff80 with size: 132.000122 MiB 00:07:13.768 element at address: 0x200003ffff80 with size: 64.000122 MiB 00:07:13.768 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:13.768 element at address: 0x2000194fff80 with size: 1.000122 MiB 00:07:13.768 element at address: 0x2000196fff80 with size: 1.000122 MiB 00:07:13.768 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:13.768 element at address: 0x2000196eff00 with size: 0.062622 MiB 00:07:13.768 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:13.768 element at address: 0x2000196efdc0 with size: 0.000305 MiB 00:07:13.768 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000004ffa00 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000004ffac0 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000004ffb80 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000004ffd80 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000004ffe40 with size: 0.000183 MiB 00:07:13.768 element at address: 0x20000085ae40 with size: 0.000183 MiB 00:07:13.768 element at address: 0x20000085b040 with size: 0.000183 MiB 00:07:13.768 element at address: 0x20000085b100 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000008db3c0 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000008db5c0 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000008df880 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000008ffb40 with size: 0.000183 MiB 00:07:13.768 element at address: 0x200000c7ed40 with size: 0.000183 MiB 00:07:13.768 element at address: 0x200000cff000 with size: 0.000183 MiB 00:07:13.768 element at address: 0x200000cff0c0 with size: 0.000183 
MiB 00:07:13.768 element at address: 0x200003e7da00 with size: 0.000183 MiB 00:07:13.768 element at address: 0x200003e7dac0 with size: 0.000183 MiB 00:07:13.768 element at address: 0x200003efdd80 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000080fdd80 with size: 0.000183 MiB 00:07:13.768 element at address: 0x20001067b600 with size: 0.000183 MiB 00:07:13.768 element at address: 0x20001067b6c0 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000106fb980 with size: 0.000183 MiB 00:07:13.768 element at address: 0x200012cf1bc0 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000196efc40 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000196efd00 with size: 0.000183 MiB 00:07:13.768 element at address: 0x2000198bc740 with size: 0.000183 MiB 00:07:13.768 element at address: 0x20001ae95380 with size: 0.000183 MiB 00:07:13.768 element at address: 0x20001ae95440 with size: 0.000183 MiB 00:07:13.768 element at address: 0x200028268f80 with size: 0.000183 MiB 00:07:13.768 element at address: 0x200028269040 with size: 0.000183 MiB 00:07:13.768 element at address: 0x20002826fc40 with size: 0.000183 MiB 00:07:13.768 element at address: 0x20002826fe40 with size: 0.000183 MiB 00:07:13.768 element at address: 0x20002826ff00 with size: 0.000183 MiB 00:07:13.768 list of memzone associated elements. size: 607.928894 MiB 00:07:13.768 element at address: 0x20001ae95500 with size: 211.416748 MiB 00:07:13.768 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:13.768 element at address: 0x20002826ffc0 with size: 157.562561 MiB 00:07:13.768 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:13.768 element at address: 0x200012df1e80 with size: 100.055054 MiB 00:07:13.768 associated memzone info: size: 100.054932 MiB name: MP_bdev_io_139541_0 00:07:13.768 element at address: 0x200000dff380 with size: 48.003052 MiB 00:07:13.768 associated memzone info: size: 48.002930 MiB name: MP_msgpool_139541_0 00:07:13.768 element at address: 0x2000107fdb80 with size: 36.008911 MiB 00:07:13.768 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_139541_0 00:07:13.768 element at address: 0x2000199be940 with size: 20.255554 MiB 00:07:13.768 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:13.768 element at address: 0x2000321feb40 with size: 18.005066 MiB 00:07:13.768 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:13.768 element at address: 0x2000004fff00 with size: 3.000244 MiB 00:07:13.768 associated memzone info: size: 3.000122 MiB name: MP_evtpool_139541_0 00:07:13.768 element at address: 0x2000009ffe00 with size: 2.000488 MiB 00:07:13.768 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_139541 00:07:13.768 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:07:13.768 associated memzone info: size: 1.007996 MiB name: MP_evtpool_139541 00:07:13.768 element at address: 0x2000106fba40 with size: 1.008118 MiB 00:07:13.768 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:13.768 element at address: 0x2000198bc800 with size: 1.008118 MiB 00:07:13.768 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:13.768 element at address: 0x2000080fde40 with size: 1.008118 MiB 00:07:13.768 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:13.768 element at address: 0x200003efde40 with size: 1.008118 MiB 00:07:13.768 associated memzone info: size: 1.007996 MiB name: 
MP_SCSI_TASK_Pool 00:07:13.768 element at address: 0x200000cff180 with size: 1.000488 MiB 00:07:13.768 associated memzone info: size: 1.000366 MiB name: RG_ring_0_139541 00:07:13.768 element at address: 0x2000008ffc00 with size: 1.000488 MiB 00:07:13.768 associated memzone info: size: 1.000366 MiB name: RG_ring_1_139541 00:07:13.768 element at address: 0x200012cf1c80 with size: 1.000488 MiB 00:07:13.768 associated memzone info: size: 1.000366 MiB name: RG_ring_4_139541 00:07:13.768 element at address: 0x2000320fe940 with size: 1.000488 MiB 00:07:13.768 associated memzone info: size: 1.000366 MiB name: RG_ring_5_139541 00:07:13.768 element at address: 0x20000085b1c0 with size: 0.500488 MiB 00:07:13.768 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_139541 00:07:13.768 element at address: 0x200000c7ee00 with size: 0.500488 MiB 00:07:13.768 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_139541 00:07:13.768 element at address: 0x20001067b780 with size: 0.500488 MiB 00:07:13.768 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:13.768 element at address: 0x200003e7db80 with size: 0.500488 MiB 00:07:13.769 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:13.769 element at address: 0x20001987c540 with size: 0.250488 MiB 00:07:13.769 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:13.769 element at address: 0x2000002b7a40 with size: 0.125488 MiB 00:07:13.769 associated memzone info: size: 0.125366 MiB name: RG_MP_evtpool_139541 00:07:13.769 element at address: 0x2000008df940 with size: 0.125488 MiB 00:07:13.769 associated memzone info: size: 0.125366 MiB name: RG_ring_2_139541 00:07:13.769 element at address: 0x2000080f5b80 with size: 0.031738 MiB 00:07:13.769 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:13.769 element at address: 0x200028269100 with size: 0.023743 MiB 00:07:13.769 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:13.769 element at address: 0x2000008db680 with size: 0.016113 MiB 00:07:13.769 associated memzone info: size: 0.015991 MiB name: RG_ring_3_139541 00:07:13.769 element at address: 0x20002826f240 with size: 0.002441 MiB 00:07:13.769 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:13.769 element at address: 0x2000004ffc40 with size: 0.000305 MiB 00:07:13.769 associated memzone info: size: 0.000183 MiB name: MP_msgpool_139541 00:07:13.769 element at address: 0x2000008db480 with size: 0.000305 MiB 00:07:13.769 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_139541 00:07:13.769 element at address: 0x20000085af00 with size: 0.000305 MiB 00:07:13.769 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_139541 00:07:13.769 element at address: 0x20002826fd00 with size: 0.000305 MiB 00:07:13.769 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:13.769 12:49:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:13.769 12:49:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 139541 00:07:13.769 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' -z 139541 ']' 00:07:13.769 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@958 -- # kill -0 139541 00:07:13.769 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@959 -- # uname 00:07:13.769 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
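The memzone dump above is produced by the two helpers named in the trace: the env_dpdk_get_mem_stats RPC makes the running spdk_tgt write a raw snapshot (the RPC response shows /tmp/spdk_mem_dump.txt), and scripts/dpdk_mem_info.py renders it as the heap/mempool/memzone summary, with -m 0 restricting the view to heap id 0. A minimal sketch of that sequence, assuming a running target and using SPDK_DIR as a placeholder for the checkout path:

    SPDK_DIR=${SPDK_DIR:-/path/to/spdk}   # placeholder, not the Jenkins workspace path

    # Ask the running spdk_tgt to dump its DPDK memory stats; the default
    # output file is the /tmp/spdk_mem_dump.txt seen in the RPC response above.
    "$SPDK_DIR/scripts/rpc.py" env_dpdk_get_mem_stats

    # Summarize all heaps/mempools/memzones, then heap id 0 only,
    # mirroring the two dpdk_mem_info.py invocations in the trace.
    "$SPDK_DIR/scripts/dpdk_mem_info.py"
    "$SPDK_DIR/scripts/dpdk_mem_info.py" -m 0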
00:07:13.769 12:49:16 dpdk_mem_utility -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 139541 00:07:13.769 12:49:17 dpdk_mem_utility -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:13.769 12:49:17 dpdk_mem_utility -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:13.769 12:49:17 dpdk_mem_utility -- common/autotest_common.sh@972 -- # echo 'killing process with pid 139541' 00:07:13.769 killing process with pid 139541 00:07:13.769 12:49:17 dpdk_mem_utility -- common/autotest_common.sh@973 -- # kill 139541 00:07:13.769 12:49:17 dpdk_mem_utility -- common/autotest_common.sh@978 -- # wait 139541 00:07:14.030 00:07:14.030 real 0m0.995s 00:07:14.030 user 0m0.913s 00:07:14.030 sys 0m0.458s 00:07:14.030 12:49:17 dpdk_mem_utility -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:14.030 12:49:17 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:14.030 ************************************ 00:07:14.030 END TEST dpdk_mem_utility 00:07:14.030 ************************************ 00:07:14.291 12:49:17 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:14.291 12:49:17 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:14.291 12:49:17 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:14.291 12:49:17 -- common/autotest_common.sh@10 -- # set +x 00:07:14.291 ************************************ 00:07:14.291 START TEST event 00:07:14.291 ************************************ 00:07:14.291 12:49:17 event -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:14.291 * Looking for test storage... 00:07:14.291 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:14.291 12:49:17 event -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:07:14.291 12:49:17 event -- common/autotest_common.sh@1711 -- # lcov --version 00:07:14.291 12:49:17 event -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:07:14.291 12:49:17 event -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:07:14.291 12:49:17 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:14.291 12:49:17 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:14.291 12:49:17 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:14.291 12:49:17 event -- scripts/common.sh@336 -- # IFS=.-: 00:07:14.552 12:49:17 event -- scripts/common.sh@336 -- # read -ra ver1 00:07:14.552 12:49:17 event -- scripts/common.sh@337 -- # IFS=.-: 00:07:14.552 12:49:17 event -- scripts/common.sh@337 -- # read -ra ver2 00:07:14.552 12:49:17 event -- scripts/common.sh@338 -- # local 'op=<' 00:07:14.552 12:49:17 event -- scripts/common.sh@340 -- # ver1_l=2 00:07:14.552 12:49:17 event -- scripts/common.sh@341 -- # ver2_l=1 00:07:14.552 12:49:17 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:14.552 12:49:17 event -- scripts/common.sh@344 -- # case "$op" in 00:07:14.552 12:49:17 event -- scripts/common.sh@345 -- # : 1 00:07:14.552 12:49:17 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:14.552 12:49:17 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:14.552 12:49:17 event -- scripts/common.sh@365 -- # decimal 1 00:07:14.552 12:49:17 event -- scripts/common.sh@353 -- # local d=1 00:07:14.552 12:49:17 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:14.552 12:49:17 event -- scripts/common.sh@355 -- # echo 1 00:07:14.552 12:49:17 event -- scripts/common.sh@365 -- # ver1[v]=1 00:07:14.552 12:49:17 event -- scripts/common.sh@366 -- # decimal 2 00:07:14.552 12:49:17 event -- scripts/common.sh@353 -- # local d=2 00:07:14.552 12:49:17 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:14.552 12:49:17 event -- scripts/common.sh@355 -- # echo 2 00:07:14.552 12:49:17 event -- scripts/common.sh@366 -- # ver2[v]=2 00:07:14.552 12:49:17 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:14.552 12:49:17 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:14.552 12:49:17 event -- scripts/common.sh@368 -- # return 0 00:07:14.552 12:49:17 event -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:14.552 12:49:17 event -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:07:14.552 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:14.552 --rc genhtml_branch_coverage=1 00:07:14.552 --rc genhtml_function_coverage=1 00:07:14.552 --rc genhtml_legend=1 00:07:14.552 --rc geninfo_all_blocks=1 00:07:14.552 --rc geninfo_unexecuted_blocks=1 00:07:14.552 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:14.552 ' 00:07:14.552 12:49:17 event -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:07:14.552 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:14.552 --rc genhtml_branch_coverage=1 00:07:14.552 --rc genhtml_function_coverage=1 00:07:14.552 --rc genhtml_legend=1 00:07:14.552 --rc geninfo_all_blocks=1 00:07:14.552 --rc geninfo_unexecuted_blocks=1 00:07:14.552 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:14.552 ' 00:07:14.552 12:49:17 event -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:07:14.552 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:14.552 --rc genhtml_branch_coverage=1 00:07:14.552 --rc genhtml_function_coverage=1 00:07:14.552 --rc genhtml_legend=1 00:07:14.552 --rc geninfo_all_blocks=1 00:07:14.552 --rc geninfo_unexecuted_blocks=1 00:07:14.552 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:14.552 ' 00:07:14.552 12:49:17 event -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:07:14.552 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:14.552 --rc genhtml_branch_coverage=1 00:07:14.552 --rc genhtml_function_coverage=1 00:07:14.552 --rc genhtml_legend=1 00:07:14.552 --rc geninfo_all_blocks=1 00:07:14.552 --rc geninfo_unexecuted_blocks=1 00:07:14.553 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:14.553 ' 00:07:14.553 12:49:17 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:14.553 12:49:17 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:14.553 12:49:17 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:14.553 12:49:17 event -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:07:14.553 12:49:17 event -- common/autotest_common.sh@1111 -- # xtrace_disable 
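Every test in this section is wrapped in the same pattern: a row of asterisks with START TEST, the test command run under xtrace, the real/user/sys trio from timing the command, and a closing END TEST banner. The following is a sketch of that wrapper pattern only; run_test_sketch is a hypothetical name, not SPDK's actual run_test implementation:

    run_test_sketch() {
      # Illustrative banner-and-timing wrapper matching the log pattern above.
      local name=$1; shift
      printf '%s\n' '************************************' "START TEST $name"
      time "$@"    # bash's `time` emits the real/user/sys trio on completion
      printf '%s\n' "END TEST $name" '************************************'
    }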
00:07:14.553 12:49:17 event -- common/autotest_common.sh@10 -- # set +x 00:07:14.553 ************************************ 00:07:14.553 START TEST event_perf 00:07:14.553 ************************************ 00:07:14.553 12:49:17 event.event_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:14.553 Running I/O for 1 seconds...[2024-12-05 12:49:17.677443] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:14.553 [2024-12-05 12:49:17.677541] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139698 ] 00:07:14.553 [2024-12-05 12:49:17.747936] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:14.553 [2024-12-05 12:49:17.774371] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:14.553 [2024-12-05 12:49:17.774482] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:14.553 [2024-12-05 12:49:17.774566] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.553 [2024-12-05 12:49:17.774567] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:15.494 Running I/O for 1 seconds... 00:07:15.494 lcore 0: 202023 00:07:15.495 lcore 1: 202023 00:07:15.495 lcore 2: 202025 00:07:15.495 lcore 3: 202023 00:07:15.495 done. 00:07:15.495 00:07:15.495 real 0m1.145s 00:07:15.495 user 0m4.062s 00:07:15.495 sys 0m0.080s 00:07:15.495 12:49:18 event.event_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:15.495 12:49:18 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:15.495 ************************************ 00:07:15.495 END TEST event_perf 00:07:15.495 ************************************ 00:07:15.755 12:49:18 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:15.755 12:49:18 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:07:15.755 12:49:18 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:15.755 12:49:18 event -- common/autotest_common.sh@10 -- # set +x 00:07:15.755 ************************************ 00:07:15.755 START TEST event_reactor 00:07:15.755 ************************************ 00:07:15.755 12:49:18 event.event_reactor -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:15.755 [2024-12-05 12:49:18.912454] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
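event_perf was launched with -m 0xF, and the startup records confirm the mask: "Total cores available: 4" followed by one reactor per core 0-3, each lcore then handling roughly 202k events in the one-second run. Decoding such a core mask is plain bit arithmetic; mask_to_cores below is a hypothetical helper for illustration, not part of SPDK:

    mask_to_cores() {
      # Hypothetical helper: print the core IDs selected by a hex core mask.
      local mask=$(( $1 )) core=0 out=""
      while (( mask > 0 )); do
        (( mask & 1 )) && out+="$core "
        mask=$(( mask >> 1 ))
        core=$(( core + 1 ))
      done
      echo "$out"
    }
    mask_to_cores 0xF   # prints: 0 1 2 3, matching the four reactors above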
00:07:15.756 [2024-12-05 12:49:18.912537] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid139907 ] 00:07:15.756 [2024-12-05 12:49:19.001987] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.756 [2024-12-05 12:49:19.025786] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.140 test_start 00:07:17.140 oneshot 00:07:17.140 tick 100 00:07:17.140 tick 100 00:07:17.140 tick 250 00:07:17.140 tick 100 00:07:17.140 tick 100 00:07:17.140 tick 100 00:07:17.140 tick 250 00:07:17.140 tick 500 00:07:17.140 tick 100 00:07:17.140 tick 100 00:07:17.140 tick 250 00:07:17.140 tick 100 00:07:17.140 tick 100 00:07:17.140 test_end 00:07:17.140 00:07:17.140 real 0m1.162s 00:07:17.140 user 0m1.066s 00:07:17.141 sys 0m0.091s 00:07:17.141 12:49:20 event.event_reactor -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:17.141 12:49:20 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:17.141 ************************************ 00:07:17.141 END TEST event_reactor 00:07:17.141 ************************************ 00:07:17.141 12:49:20 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:17.141 12:49:20 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:07:17.141 12:49:20 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:17.141 12:49:20 event -- common/autotest_common.sh@10 -- # set +x 00:07:17.141 ************************************ 00:07:17.141 START TEST event_reactor_perf 00:07:17.141 ************************************ 00:07:17.141 12:49:20 event.event_reactor_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:17.141 [2024-12-05 12:49:20.160258] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
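Each test's cost is recorded by the real/user/sys trio printed just before its END TEST banner (0m1.145s for event_perf, 0m1.162s for event_reactor, and so on). Assuming the log is normalized to one timestamped record per line, a scrape along these lines pairs each test name with its wall-clock time; build.log is a placeholder, and the field layout is an assumption about that normalized format:

    # Assumes records like "00:07:17.140 real 0m1.162s" and
    # "00:07:17.141 END TEST event_reactor", one per line.
    awk '$2 == "real" { t = $3 }
         $2 == "END" && $3 == "TEST" { print $4, t }' build.log
    # e.g.: event_perf 0m1.145s
    #       event_reactor 0m1.162s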
00:07:17.141 [2024-12-05 12:49:20.160341] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid140189 ] 00:07:17.141 [2024-12-05 12:49:20.247484] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.141 [2024-12-05 12:49:20.271808] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.082 test_start 00:07:18.082 test_end 00:07:18.082 Performance: 945100 events per second 00:07:18.082 00:07:18.082 real 0m1.162s 00:07:18.082 user 0m1.060s 00:07:18.082 sys 0m0.098s 00:07:18.082 12:49:21 event.event_reactor_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:18.082 12:49:21 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:18.082 ************************************ 00:07:18.082 END TEST event_reactor_perf 00:07:18.082 ************************************ 00:07:18.082 12:49:21 event -- event/event.sh@49 -- # uname -s 00:07:18.082 12:49:21 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:18.082 12:49:21 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:18.082 12:49:21 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:18.082 12:49:21 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:18.082 12:49:21 event -- common/autotest_common.sh@10 -- # set +x 00:07:18.082 ************************************ 00:07:18.082 START TEST event_scheduler 00:07:18.082 ************************************ 00:07:18.082 12:49:21 event.event_scheduler -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:18.343 * Looking for test storage... 
00:07:18.343 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:07:18.343 12:49:21 event.event_scheduler -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:07:18.343 12:49:21 event.event_scheduler -- common/autotest_common.sh@1711 -- # lcov --version 00:07:18.343 12:49:21 event.event_scheduler -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:07:18.343 12:49:21 event.event_scheduler -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:18.343 12:49:21 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:07:18.344 12:49:21 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:07:18.344 12:49:21 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:18.344 12:49:21 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:18.344 12:49:21 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:07:18.344 12:49:21 event.event_scheduler -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:18.344 12:49:21 event.event_scheduler -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:07:18.344 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:18.344 --rc genhtml_branch_coverage=1 00:07:18.344 --rc genhtml_function_coverage=1 00:07:18.344 --rc genhtml_legend=1 00:07:18.344 --rc geninfo_all_blocks=1 00:07:18.344 --rc geninfo_unexecuted_blocks=1 00:07:18.344 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:18.344 ' 00:07:18.344 12:49:21 event.event_scheduler -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:07:18.344 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:18.344 --rc genhtml_branch_coverage=1 00:07:18.344 --rc genhtml_function_coverage=1 00:07:18.344 --rc genhtml_legend=1 00:07:18.344 --rc geninfo_all_blocks=1 00:07:18.344 --rc geninfo_unexecuted_blocks=1 00:07:18.344 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:18.344 ' 00:07:18.344 12:49:21 event.event_scheduler -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:07:18.344 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:18.344 --rc genhtml_branch_coverage=1 00:07:18.344 --rc genhtml_function_coverage=1 00:07:18.344 --rc genhtml_legend=1 00:07:18.344 --rc geninfo_all_blocks=1 00:07:18.344 --rc geninfo_unexecuted_blocks=1 00:07:18.344 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:18.344 ' 00:07:18.344 12:49:21 event.event_scheduler -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:07:18.344 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:18.344 --rc genhtml_branch_coverage=1 00:07:18.344 --rc genhtml_function_coverage=1 00:07:18.344 --rc genhtml_legend=1 00:07:18.344 --rc geninfo_all_blocks=1 00:07:18.344 --rc geninfo_unexecuted_blocks=1 00:07:18.344 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:18.344 ' 00:07:18.344 12:49:21 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:18.344 12:49:21 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=140509 00:07:18.344 12:49:21 event.event_scheduler -- 
scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:18.344 12:49:21 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:18.344 12:49:21 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 140509 00:07:18.344 12:49:21 event.event_scheduler -- common/autotest_common.sh@835 -- # '[' -z 140509 ']' 00:07:18.344 12:49:21 event.event_scheduler -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:18.344 12:49:21 event.event_scheduler -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:18.344 12:49:21 event.event_scheduler -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:18.344 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:18.344 12:49:21 event.event_scheduler -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:18.344 12:49:21 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:18.344 [2024-12-05 12:49:21.606132] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:18.344 [2024-12-05 12:49:21.606215] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid140509 ] 00:07:18.605 [2024-12-05 12:49:21.693333] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:18.605 [2024-12-05 12:49:21.719558] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.605 [2024-12-05 12:49:21.719671] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:18.605 [2024-12-05 12:49:21.719777] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:18.605 [2024-12-05 12:49:21.719778] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:18.605 12:49:21 event.event_scheduler -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:18.605 12:49:21 event.event_scheduler -- common/autotest_common.sh@868 -- # return 0 00:07:18.605 12:49:21 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:18.605 12:49:21 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.605 12:49:21 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:18.605 [2024-12-05 12:49:21.784554] dpdk_governor.c: 178:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:18.605 [2024-12-05 12:49:21.784573] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:07:18.605 [2024-12-05 12:49:21.784585] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:18.605 [2024-12-05 12:49:21.784592] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:18.605 [2024-12-05 12:49:21.784599] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:18.605 12:49:21 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.605 12:49:21 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:18.605 12:49:21 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.605 
12:49:21 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:18.605 [2024-12-05 12:49:21.853743] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:07:18.605 12:49:21 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.605 12:49:21 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:18.605 12:49:21 event.event_scheduler -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:18.605 12:49:21 event.event_scheduler -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:18.605 12:49:21 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:18.605 ************************************ 00:07:18.605 START TEST scheduler_create_thread 00:07:18.605 ************************************ 00:07:18.605 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1129 -- # scheduler_create_thread 00:07:18.605 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:18.605 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.605 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.605 2 00:07:18.605 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.605 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:18.605 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.605 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.866 3 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.866 4 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.866 5 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.866 12:49:21 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.866 6 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.866 7 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.866 8 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.866 9 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.866 10 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:18.866 12:49:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:18.866 12:49:22 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.807 12:49:22 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:19.807 12:49:22 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:19.807 12:49:22 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:19.807 12:49:22 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:21.187 12:49:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:21.187 12:49:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:21.188 12:49:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:21.188 12:49:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:21.188 12:49:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:22.124 12:49:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:22.124 00:07:22.124 real 0m3.381s 00:07:22.124 user 0m0.028s 00:07:22.124 sys 0m0.004s 00:07:22.124 12:49:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:22.124 12:49:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:22.124 ************************************ 00:07:22.124 END TEST scheduler_create_thread 00:07:22.124 ************************************ 00:07:22.124 12:49:25 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:22.124 12:49:25 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 140509 00:07:22.124 12:49:25 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' -z 140509 ']' 00:07:22.124 12:49:25 event.event_scheduler -- common/autotest_common.sh@958 -- # kill -0 140509 00:07:22.124 12:49:25 event.event_scheduler -- common/autotest_common.sh@959 -- # uname 00:07:22.124 12:49:25 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:22.124 12:49:25 event.event_scheduler -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 140509 00:07:22.124 12:49:25 event.event_scheduler -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:22.124 12:49:25 event.event_scheduler -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:22.124 12:49:25 event.event_scheduler -- common/autotest_common.sh@972 -- # echo 'killing process with pid 140509' 00:07:22.124 killing process with pid 140509 00:07:22.124 12:49:25 event.event_scheduler -- common/autotest_common.sh@973 -- # kill 140509 00:07:22.124 12:49:25 event.event_scheduler -- common/autotest_common.sh@978 -- # wait 140509 00:07:22.383 [2024-12-05 12:49:25.657473] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
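The event_scheduler block above reduces to one pattern: launch the test app RPC-gated, switch it to the dynamic scheduler, complete init, create a mix of test threads through the app's scheduler_plugin RPCs, then tear down. A condensed sketch, not the literal script: rpc_cmd, waitforlisten and killprocess are the autotest_common.sh helpers visible in the trace, the thread-creation order is compressed into a loop, and -a appears to be the thread's target active percentage, judging by the one_third_active/half_active names:

  $SPDK/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f &
  scheduler_pid=$!
  trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
  waitforlisten $scheduler_pid
  # the dpdk_governor ERROR above is non-fatal: the dynamic scheduler falls back
  # to its defaults (load limit 20, core limit 80, core busy 95) and init proceeds
  rpc_cmd framework_set_scheduler dynamic
  rpc_cmd framework_start_init
  for m in 0x1 0x2 0x4 0x8; do
      rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m $m -a 100
      rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m $m -a 0
  done
  rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
  tid=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)
  rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active "$tid" 50
  tid=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)
  rpc_cmd --plugin scheduler_plugin scheduler_thread_delete "$tid"
  trap - SIGINT SIGTERM EXIT
  killprocess $scheduler_pid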
00:07:22.642 00:07:22.642 real 0m4.467s 00:07:22.642 user 0m7.804s 00:07:22.642 sys 0m0.484s 00:07:22.642 12:49:25 event.event_scheduler -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:22.642 12:49:25 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:22.642 ************************************ 00:07:22.642 END TEST event_scheduler 00:07:22.642 ************************************ 00:07:22.642 12:49:25 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:22.642 12:49:25 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:22.642 12:49:25 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:22.642 12:49:25 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:22.642 12:49:25 event -- common/autotest_common.sh@10 -- # set +x 00:07:22.642 ************************************ 00:07:22.642 START TEST app_repeat 00:07:22.642 ************************************ 00:07:22.642 12:49:25 event.app_repeat -- common/autotest_common.sh@1129 -- # app_repeat_test 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@19 -- # repeat_pid=141357 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 141357' 00:07:22.642 Process app_repeat pid: 141357 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:22.642 spdk_app_start Round 0 00:07:22.642 12:49:25 event.app_repeat -- event/event.sh@25 -- # waitforlisten 141357 /var/tmp/spdk-nbd.sock 00:07:22.642 12:49:25 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 141357 ']' 00:07:22.642 12:49:25 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:22.642 12:49:25 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:22.642 12:49:25 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:22.642 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:22.642 12:49:25 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:22.642 12:49:25 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:22.901 [2024-12-05 12:49:25.968987] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
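app_repeat, starting here, is a restart-loop test: a single SPDK app serving /var/tmp/spdk-nbd.sock is repeatedly asked to shut down over RPC and evidently starts the next round itself (hence the repeated 'spdk_app_start Round N' notices). Each round traced below creates two malloc bdevs, exports them as NBD block devices, and verifies I/O through them; condensed per-round sketches follow the Round 0 trace. Startup, condensed ($SPDK as before):

  modprobe nbd                    # the wrapper first dry-runs this with modprobe -n
  $SPDK/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 &
  repeat_pid=$!
  trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT
  waitforlisten $repeat_pid /var/tmp/spdk-nbd.sock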
00:07:22.901 [2024-12-05 12:49:25.969068] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid141357 ] 00:07:22.901 [2024-12-05 12:49:26.054758] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:22.901 [2024-12-05 12:49:26.078462] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.901 [2024-12-05 12:49:26.078463] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.901 12:49:26 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:22.901 12:49:26 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:22.901 12:49:26 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.160 Malloc0 00:07:23.160 12:49:26 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.420 Malloc1 00:07:23.420 12:49:26 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.420 12:49:26 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:23.679 /dev/nbd0 00:07:23.679 12:49:26 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:23.679 12:49:26 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 
/proc/partitions 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:23.679 1+0 records in 00:07:23.679 1+0 records out 00:07:23.679 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226261 s, 18.1 MB/s 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:23.679 12:49:26 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:23.679 12:49:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.679 12:49:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.679 12:49:26 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:23.939 /dev/nbd1 00:07:23.939 12:49:27 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:23.939 12:49:27 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:23.939 1+0 records in 00:07:23.939 1+0 records out 00:07:23.939 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000296573 s, 13.8 MB/s 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:23.939 12:49:27 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:23.939 12:49:27 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.939 12:49:27 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 
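Round 0's device setup above, condensed. $rpc abbreviates the socket-qualified rpc.py invocation; the bounded 20-try waitfornbd loop is compressed here to an unbounded poll with an assumed short sleep, and /tmp/nbdtest stands in for the workspace-local temp file:

  rpc="$SPDK/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  $rpc bdev_malloc_create 64 4096        # 64 MiB malloc bdev, 4096-byte blocks -> prints Malloc0
  $rpc bdev_malloc_create 64 4096        # second identical bdev -> Malloc1
  $rpc nbd_start_disk Malloc0 /dev/nbd0
  $rpc nbd_start_disk Malloc1 /dev/nbd1
  for d in nbd0 nbd1; do
      # waitfornbd: the device must show up in /proc/partitions and a 1-block
      # direct read must produce a non-empty file before the test proceeds
      until grep -q -w "$d" /proc/partitions; do sleep 0.1; done
      dd if=/dev/$d of=/tmp/nbdtest bs=4096 count=1 iflag=direct
      [ "$(stat -c %s /tmp/nbdtest)" -ne 0 ] && rm -f /tmp/nbdtest
  done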
00:07:23.939 12:49:27 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:23.939 12:49:27 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.939 12:49:27 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:24.198 12:49:27 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:24.198 { 00:07:24.198 "nbd_device": "/dev/nbd0", 00:07:24.198 "bdev_name": "Malloc0" 00:07:24.198 }, 00:07:24.198 { 00:07:24.198 "nbd_device": "/dev/nbd1", 00:07:24.198 "bdev_name": "Malloc1" 00:07:24.198 } 00:07:24.198 ]' 00:07:24.198 12:49:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:24.198 { 00:07:24.199 "nbd_device": "/dev/nbd0", 00:07:24.199 "bdev_name": "Malloc0" 00:07:24.199 }, 00:07:24.199 { 00:07:24.199 "nbd_device": "/dev/nbd1", 00:07:24.199 "bdev_name": "Malloc1" 00:07:24.199 } 00:07:24.199 ]' 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:24.199 /dev/nbd1' 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:24.199 /dev/nbd1' 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:24.199 256+0 records in 00:07:24.199 256+0 records out 00:07:24.199 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108821 s, 96.4 MB/s 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:24.199 256+0 records in 00:07:24.199 256+0 records out 00:07:24.199 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200539 s, 52.3 MB/s 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:24.199 256+0 records in 00:07:24.199 256+0 records out 00:07:24.199 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0219128 s, 47.9 
MB/s 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.199 12:49:27 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:24.459 12:49:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:24.459 12:49:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:24.459 12:49:27 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:24.459 12:49:27 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:24.459 12:49:27 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:24.459 12:49:27 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:24.459 12:49:27 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:24.459 12:49:27 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:24.459 12:49:27 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.459 12:49:27 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:24.719 12:49:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:24.719 12:49:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:24.719 12:49:27 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:24.719 12:49:27 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:24.719 12:49:27 
event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:24.719 12:49:27 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:24.719 12:49:27 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:24.719 12:49:27 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:24.719 12:49:27 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:24.719 12:49:27 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.719 12:49:27 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:24.719 12:49:28 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:24.719 12:49:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:24.719 12:49:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:24.979 12:49:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:24.979 12:49:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:24.979 12:49:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:24.979 12:49:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:24.979 12:49:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:24.979 12:49:28 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:24.979 12:49:28 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:24.979 12:49:28 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:24.979 12:49:28 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:24.979 12:49:28 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:24.979 12:49:28 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:25.239 [2024-12-05 12:49:28.412940] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:25.239 [2024-12-05 12:49:28.432331] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:25.239 [2024-12-05 12:49:28.432332] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.239 [2024-12-05 12:49:28.472987] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:25.239 [2024-12-05 12:49:28.473029] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:28.546 12:49:31 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:28.546 12:49:31 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:28.546 spdk_app_start Round 1 00:07:28.546 12:49:31 event.app_repeat -- event/event.sh@25 -- # waitforlisten 141357 /var/tmp/spdk-nbd.sock 00:07:28.546 12:49:31 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 141357 ']' 00:07:28.546 12:49:31 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:28.546 12:49:31 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:28.546 12:49:31 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:28.546 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:28.546 12:49:31 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:28.546 12:49:31 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:28.546 12:49:31 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:28.546 12:49:31 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:28.546 12:49:31 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:28.546 Malloc0 00:07:28.546 12:49:31 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:28.806 Malloc1 00:07:28.806 12:49:31 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:28.806 12:49:31 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:28.806 /dev/nbd0 00:07:29.067 12:49:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:29.067 12:49:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:29.067 1+0 records in 00:07:29.067 1+0 records out 00:07:29.067 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225824 s, 18.1 MB/s 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:29.067 12:49:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.067 12:49:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.067 12:49:32 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:29.067 /dev/nbd1 00:07:29.067 12:49:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:29.067 12:49:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:29.067 1+0 records in 00:07:29.067 1+0 records out 00:07:29.067 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00034926 s, 11.7 MB/s 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:29.067 12:49:32 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:29.327 12:49:32 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:29.327 12:49:32 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:29.327 12:49:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.327 12:49:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.327 12:49:32 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:29.328 { 00:07:29.328 "nbd_device": "/dev/nbd0", 00:07:29.328 "bdev_name": "Malloc0" 00:07:29.328 }, 00:07:29.328 { 00:07:29.328 "nbd_device": "/dev/nbd1", 00:07:29.328 "bdev_name": "Malloc1" 00:07:29.328 } 00:07:29.328 ]' 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:29.328 { 00:07:29.328 "nbd_device": "/dev/nbd0", 00:07:29.328 "bdev_name": "Malloc0" 00:07:29.328 }, 00:07:29.328 { 00:07:29.328 "nbd_device": "/dev/nbd1", 00:07:29.328 "bdev_name": "Malloc1" 00:07:29.328 } 00:07:29.328 ]' 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:29.328 /dev/nbd1' 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:29.328 /dev/nbd1' 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:29.328 12:49:32 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:29.588 256+0 records in 00:07:29.588 256+0 records out 00:07:29.588 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104559 s, 100 MB/s 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:29.588 256+0 records in 00:07:29.588 256+0 records out 00:07:29.588 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200747 s, 52.2 MB/s 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:29.588 256+0 records in 00:07:29.588 256+0 records out 00:07:29.588 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0213026 s, 49.2 MB/s 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.588 12:49:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:29.848 12:49:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:29.848 12:49:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:29.848 12:49:32 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:29.848 12:49:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.848 12:49:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.848 12:49:32 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:29.848 12:49:32 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:29.848 12:49:32 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.848 12:49:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.848 12:49:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:29.848 12:49:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:29.848 12:49:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:29.848 12:49:33 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:29.848 12:49:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.848 12:49:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.848 12:49:33 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:29.848 12:49:33 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:29.848 12:49:33 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.848 12:49:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:29.848 12:49:33 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.848 12:49:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:30.108 12:49:33 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:30.108 12:49:33 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:30.368 12:49:33 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:30.629 [2024-12-05 12:49:33.764332] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:30.629 [2024-12-05 12:49:33.784065] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.629 [2024-12-05 12:49:33.784066] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:30.629 [2024-12-05 12:49:33.826005] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:30.629 [2024-12-05 12:49:33.826049] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:33.929 12:49:36 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:33.929 12:49:36 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:33.929 spdk_app_start Round 2 00:07:33.929 12:49:36 event.app_repeat -- event/event.sh@25 -- # waitforlisten 141357 /var/tmp/spdk-nbd.sock 00:07:33.929 12:49:36 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 141357 ']' 00:07:33.929 12:49:36 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:33.929 12:49:36 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:33.929 12:49:36 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:33.929 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
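Each round then finishes with the same write/verify/teardown cycle, seen above for Rounds 0 and 1 and starting again for Round 2: push 1 MiB of random data through each NBD export with direct I/O, compare the device contents back against the source file, stop the exports, and ask the app to begin the next round. Condensed below; the actual trace writes both devices before verifying either, and $rpc is as defined in the earlier sketch:

  dd if=/dev/urandom of=nbdrandtest bs=4096 count=256           # 1 MiB test pattern
  for d in /dev/nbd0 /dev/nbd1; do
      dd if=nbdrandtest of=$d bs=4096 count=256 oflag=direct    # write through the export
      cmp -b -n 1M nbdrandtest $d                               # byte-compare it back
  done
  rm nbdrandtest
  $rpc nbd_stop_disk /dev/nbd0   # waitfornbd_exit then polls /proc/partitions until nbd0 disappears
  $rpc nbd_stop_disk /dev/nbd1
  $rpc nbd_get_disks             # returns '[]' once both exports are gone
  $rpc spdk_kill_instance SIGTERM   # the app catches this and restarts into the next round
  sleep 3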
00:07:33.929 12:49:36 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:33.929 12:49:36 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:33.929 12:49:36 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:33.929 12:49:36 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:33.929 12:49:36 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:33.929 Malloc0 00:07:33.929 12:49:37 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:33.929 Malloc1 00:07:33.929 12:49:37 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:33.929 12:49:37 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:33.930 12:49:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:34.190 /dev/nbd0 00:07:34.190 12:49:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:34.190 12:49:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:34.190 1+0 records in 00:07:34.190 1+0 records out 00:07:34.190 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00152891 s, 2.7 MB/s 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:34.190 12:49:37 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:34.190 12:49:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.190 12:49:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:34.190 12:49:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:34.451 /dev/nbd1 00:07:34.451 12:49:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:34.451 12:49:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:34.451 1+0 records in 00:07:34.451 1+0 records out 00:07:34.451 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254594 s, 16.1 MB/s 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:34.451 12:49:37 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:34.451 12:49:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.451 12:49:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:34.451 12:49:37 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:34.451 12:49:37 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.451 12:49:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:34.711 { 00:07:34.711 "nbd_device": "/dev/nbd0", 00:07:34.711 "bdev_name": "Malloc0" 00:07:34.711 }, 00:07:34.711 { 00:07:34.711 "nbd_device": "/dev/nbd1", 00:07:34.711 "bdev_name": "Malloc1" 00:07:34.711 } 00:07:34.711 ]' 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:34.711 { 00:07:34.711 "nbd_device": "/dev/nbd0", 00:07:34.711 "bdev_name": "Malloc0" 00:07:34.711 }, 00:07:34.711 { 00:07:34.711 "nbd_device": "/dev/nbd1", 00:07:34.711 "bdev_name": "Malloc1" 00:07:34.711 } 00:07:34.711 ]' 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:34.711 /dev/nbd1' 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:34.711 /dev/nbd1' 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:34.711 256+0 records in 00:07:34.711 256+0 records out 00:07:34.711 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104713 s, 100 MB/s 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:34.711 12:49:37 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:34.711 256+0 records in 00:07:34.711 256+0 records out 00:07:34.711 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.02004 s, 52.3 MB/s 00:07:34.711 12:49:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:34.711 12:49:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:34.972 256+0 records in 00:07:34.972 256+0 records out 00:07:34.972 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0216234 s, 48.5 MB/s 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:34.972 12:49:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:34.973 12:49:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:34.973 12:49:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:35.233 12:49:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:35.233 12:49:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:35.233 12:49:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:35.233 12:49:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:35.233 12:49:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:35.233 12:49:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:35.233 12:49:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:35.233 12:49:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:35.233 12:49:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:35.233 12:49:38 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:35.233 12:49:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:35.493 12:49:38 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:35.493 12:49:38 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:35.754 12:49:38 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:36.014 [2024-12-05 12:49:39.093110] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:36.014 [2024-12-05 12:49:39.112615] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.014 [2024-12-05 12:49:39.112615] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:36.014 [2024-12-05 12:49:39.153973] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:36.014 [2024-12-05 12:49:39.154014] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:39.317 12:49:41 event.app_repeat -- event/event.sh@38 -- # waitforlisten 141357 /var/tmp/spdk-nbd.sock 00:07:39.317 12:49:41 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 141357 ']' 00:07:39.317 12:49:41 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:39.318 12:49:41 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:39.318 12:49:41 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:39.318 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
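Between export and I/O, the rounds lean on two wait helpers this trace keeps exercising: waitfornbd before a device is used (the grep/dd/stat sequence at autotest_common.sh@875-@893 above) and waitfornbd_exit after nbd_stop_disk. A sketch of the first, reconstructed from those traced line numbers; the retry delay and the probe path are assumptions, not visible in the log:

    waitfornbd() {
      local nbd_name=$1 i size probe=/tmp/nbdprobe   # probe path is illustrative
      for ((i = 1; i <= 20; i++)); do                # wait for the device node to register
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1                                    # assumed retry interval
      done
      for ((i = 1; i <= 20; i++)); do
        # one 4 KiB O_DIRECT read proves the kernel really attached the export
        dd if="/dev/$nbd_name" of="$probe" bs=4096 count=1 iflag=direct || continue
        size=$(stat -c %s "$probe")
        rm -f "$probe"
        [ "$size" != 0 ] && return 0
      done
      return 1
    }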
00:07:39.318 12:49:41 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:39.318 12:49:41 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:39.318 12:49:42 event.app_repeat -- event/event.sh@39 -- # killprocess 141357 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@954 -- # '[' -z 141357 ']' 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@958 -- # kill -0 141357 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@959 -- # uname 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 141357 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@972 -- # echo 'killing process with pid 141357' 00:07:39.318 killing process with pid 141357 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@973 -- # kill 141357 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@978 -- # wait 141357 00:07:39.318 spdk_app_start is called in Round 0. 00:07:39.318 Shutdown signal received, stop current app iteration 00:07:39.318 Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 reinitialization... 00:07:39.318 spdk_app_start is called in Round 1. 00:07:39.318 Shutdown signal received, stop current app iteration 00:07:39.318 Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 reinitialization... 00:07:39.318 spdk_app_start is called in Round 2. 00:07:39.318 Shutdown signal received, stop current app iteration 00:07:39.318 Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 reinitialization... 00:07:39.318 spdk_app_start is called in Round 3. 
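All three rounds finish the same way: spdk_kill_instance SIGTERM over the RPC socket, sleep 3, and, after the Round 3 summary above, killprocess on the app_repeat pid. The helper's shape, reconstructed from the autotest_common.sh@954-@978 lines in this trace; the real function also handles a sudo-wrapped process, which this sketch only stubs out:

    killprocess() {
      local pid=$1
      [ -n "$pid" ] || return 1
      kill -0 "$pid" || return 0                          # already gone, nothing to reap
      if [ "$(uname)" = Linux ]; then
        local process_name
        process_name=$(ps --no-headers -o comm= "$pid")   # reactor_0 for an SPDK app
        [ "$process_name" = sudo ] && return 1            # simplification of the sudo case
      fi
      echo "killing process with pid $pid"
      kill "$pid"
      wait "$pid" || true                                 # reap so sockets and hugepages free up
    }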
00:07:39.318 Shutdown signal received, stop current app iteration 00:07:39.318 12:49:42 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:39.318 12:49:42 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:39.318 00:07:39.318 real 0m16.407s 00:07:39.318 user 0m35.542s 00:07:39.318 sys 0m3.179s 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:39.318 12:49:42 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:39.318 ************************************ 00:07:39.318 END TEST app_repeat 00:07:39.318 ************************************ 00:07:39.318 12:49:42 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:39.318 12:49:42 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:39.318 12:49:42 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:39.318 12:49:42 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:39.318 12:49:42 event -- common/autotest_common.sh@10 -- # set +x 00:07:39.318 ************************************ 00:07:39.318 START TEST cpu_locks 00:07:39.318 ************************************ 00:07:39.318 12:49:42 event.cpu_locks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:39.318 * Looking for test storage... 00:07:39.318 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:39.318 12:49:42 event.cpu_locks -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:07:39.318 12:49:42 event.cpu_locks -- common/autotest_common.sh@1711 -- # lcov --version 00:07:39.318 12:49:42 event.cpu_locks -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:07:39.318 12:49:42 event.cpu_locks -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:39.318 12:49:42 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:07:39.318 12:49:42 event.cpu_locks -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:39.318 12:49:42 event.cpu_locks -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:07:39.318 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.318 --rc genhtml_branch_coverage=1 00:07:39.318 --rc genhtml_function_coverage=1 00:07:39.318 --rc genhtml_legend=1 00:07:39.318 --rc geninfo_all_blocks=1 00:07:39.318 --rc geninfo_unexecuted_blocks=1 00:07:39.318 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.318 ' 00:07:39.318 12:49:42 event.cpu_locks -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:07:39.318 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.318 --rc genhtml_branch_coverage=1 00:07:39.318 --rc genhtml_function_coverage=1 00:07:39.318 --rc genhtml_legend=1 00:07:39.318 --rc geninfo_all_blocks=1 00:07:39.318 --rc geninfo_unexecuted_blocks=1 00:07:39.318 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.318 ' 00:07:39.318 12:49:42 event.cpu_locks -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:07:39.318 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.318 --rc genhtml_branch_coverage=1 00:07:39.318 --rc genhtml_function_coverage=1 00:07:39.318 --rc genhtml_legend=1 00:07:39.318 --rc geninfo_all_blocks=1 00:07:39.318 --rc geninfo_unexecuted_blocks=1 00:07:39.318 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.318 ' 00:07:39.318 12:49:42 event.cpu_locks -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:07:39.318 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.318 --rc genhtml_branch_coverage=1 00:07:39.318 --rc genhtml_function_coverage=1 00:07:39.318 --rc genhtml_legend=1 00:07:39.318 --rc geninfo_all_blocks=1 00:07:39.318 --rc geninfo_unexecuted_blocks=1 00:07:39.318 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.318 ' 00:07:39.318 12:49:42 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:39.318 12:49:42 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:39.318 12:49:42 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:39.579 12:49:42 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:39.579 12:49:42 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:39.579 12:49:42 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:39.579 12:49:42 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:39.579 ************************************ 00:07:39.579 START TEST default_locks 00:07:39.579 ************************************ 00:07:39.579 12:49:42 event.cpu_locks.default_locks -- common/autotest_common.sh@1129 -- # default_locks 00:07:39.579 12:49:42 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=144501 00:07:39.579 12:49:42 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 144501 00:07:39.579 12:49:42 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:39.579 12:49:42 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 144501 ']' 00:07:39.579 12:49:42 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:39.579 12:49:42 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:39.579 12:49:42 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:39.579 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:39.579 12:49:42 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:39.579 12:49:42 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:39.579 [2024-12-05 12:49:42.698692] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
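run_test default_locks is the baseline of the cpu_locks suite: one spdk_tgt pinned to core 0, so it takes exactly one CPU-core lock, whose presence is then asserted before the process is killed. The setup as traced, with the binary path from this log:

    spdk_tgt=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
    "$spdk_tgt" -m 0x1 &             # -m 0x1: run on core 0 only -> one core lock
    spdk_tgt_pid=$!
    waitforlisten "$spdk_tgt_pid"    # poll /var/tmp/spdk.sock until RPCs are answered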
00:07:39.579 [2024-12-05 12:49:42.698752] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144501 ] 00:07:39.579 [2024-12-05 12:49:42.784273] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.579 [2024-12-05 12:49:42.805593] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.840 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:39.840 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 0 00:07:39.840 12:49:43 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 144501 00:07:39.840 12:49:43 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 144501 00:07:39.840 12:49:43 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:40.411 lslocks: write error 00:07:40.411 12:49:43 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 144501 00:07:40.411 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # '[' -z 144501 ']' 00:07:40.411 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@958 -- # kill -0 144501 00:07:40.411 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # uname 00:07:40.411 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:40.411 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144501 00:07:40.411 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:40.411 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:40.411 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144501' 00:07:40.411 killing process with pid 144501 00:07:40.411 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@973 -- # kill 144501 00:07:40.411 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@978 -- # wait 144501 00:07:40.671 12:49:43 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 144501 00:07:40.671 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # local es=0 00:07:40.671 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 144501 00:07:40.671 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # waitforlisten 144501 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 144501 ']' 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 
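The assertion itself is locks_exist (event/cpu_locks.sh@22 above): list the POSIX file locks the pid holds and match the spdk_cpu_lock pattern. The stray "lslocks: write error" in the output is expected noise, since grep -q exits at the first match and closes the pipe while lslocks is still writing. As a sketch:

    locks_exist() {
      # SPDK's per-core lock files show up in lslocks output with spdk_cpu_lock in the path
      lslocks -p "$1" | grep -q spdk_cpu_lock
    }
    locks_exist "$spdk_tgt_pid"      # must succeed while the target runs on core 0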
00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:40.932 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:40.932 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (144501) - No such process 00:07:40.932 ERROR: process (pid: 144501) is no longer running 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 1 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # es=1 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:40.932 00:07:40.932 real 0m1.320s 00:07:40.932 user 0m1.284s 00:07:40.932 sys 0m0.638s 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:40.932 12:49:43 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:40.932 ************************************ 00:07:40.932 END TEST default_locks 00:07:40.932 ************************************ 00:07:40.932 12:49:44 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:40.932 12:49:44 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:40.932 12:49:44 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:40.932 12:49:44 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:40.932 ************************************ 00:07:40.932 START TEST default_locks_via_rpc 00:07:40.932 ************************************ 00:07:40.932 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1129 -- # default_locks_via_rpc 00:07:40.932 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=144731 00:07:40.932 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 144731 00:07:40.932 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:40.932 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 144731 ']' 00:07:40.932 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:40.932 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:40.932 12:49:44 
event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:40.932 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:40.932 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:40.932 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:40.932 [2024-12-05 12:49:44.098682] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:40.932 [2024-12-05 12:49:44.098748] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144731 ] 00:07:40.932 [2024-12-05 12:49:44.180522] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.932 [2024-12-05 12:49:44.203009] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 144731 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 144731 00:07:41.193 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:41.763 12:49:44 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 144731 00:07:41.763 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # '[' -z 144731 ']' 00:07:41.763 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@958 -- # kill -0 144731 00:07:41.763 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # uname 00:07:41.763 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
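default_locks_via_rpc toggles the same state from outside the process: framework_disable_cpumask_locks makes the running target release its core locks, no_locks asserts that lslocks reports none, and framework_enable_cpumask_locks takes them back. The round-trip, with the RPC names exactly as traced and rpc.py left on its default /var/tmp/spdk.sock socket:

    rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    "$rpc" framework_disable_cpumask_locks                   # drop the core-lock files
    ! lslocks -p "$spdk_tgt_pid" | grep -q spdk_cpu_lock     # no_locks, condensed
    "$rpc" framework_enable_cpumask_locks                    # re-acquire
    lslocks -p "$spdk_tgt_pid" | grep -q spdk_cpu_lock       # locks_exist holds again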
00:07:41.763 12:49:44 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144731 00:07:41.763 12:49:45 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:41.763 12:49:45 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:41.763 12:49:45 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144731' 00:07:41.763 killing process with pid 144731 00:07:41.763 12:49:45 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@973 -- # kill 144731 00:07:41.763 12:49:45 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@978 -- # wait 144731 00:07:42.022 00:07:42.022 real 0m1.216s 00:07:42.022 user 0m1.176s 00:07:42.022 sys 0m0.602s 00:07:42.022 12:49:45 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:42.022 12:49:45 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:42.022 ************************************ 00:07:42.022 END TEST default_locks_via_rpc 00:07:42.022 ************************************ 00:07:42.022 12:49:45 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:42.022 12:49:45 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:42.022 12:49:45 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:42.022 12:49:45 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:42.281 ************************************ 00:07:42.282 START TEST non_locking_app_on_locked_coremask 00:07:42.282 ************************************ 00:07:42.282 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # non_locking_app_on_locked_coremask 00:07:42.282 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=144903 00:07:42.282 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 144903 /var/tmp/spdk.sock 00:07:42.282 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:42.282 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 144903 ']' 00:07:42.282 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:42.282 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:42.282 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:42.282 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:42.282 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:42.282 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:42.282 [2024-12-05 12:49:45.398606] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
00:07:42.282 [2024-12-05 12:49:45.398684] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid144903 ] 00:07:42.282 [2024-12-05 12:49:45.483841] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.282 [2024-12-05 12:49:45.506067] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.541 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:42.541 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:42.541 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=145065 00:07:42.541 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 145065 /var/tmp/spdk2.sock 00:07:42.541 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:42.541 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 145065 ']' 00:07:42.541 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:42.541 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:42.541 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:42.541 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:42.541 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:42.541 12:49:45 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:42.541 [2024-12-05 12:49:45.730816] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:42.541 [2024-12-05 12:49:45.730919] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145065 ] 00:07:42.541 [2024-12-05 12:49:45.824633] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
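non_locking_app_on_locked_coremask shows what the opt-out flag is for: pid 144903 holds the core-0 lock, yet pid 145065, started with --disable-cpumask-locks and its own RPC socket, comes up on the same core and logs "CPU core locks deactivated." as seen above. Continuing the earlier sketch, with flags and sockets taken from the trace:

    "$spdk_tgt" -m 0x1 &                            # first target: holds the core-0 lock
    pid1=$!
    waitforlisten "$pid1"
    "$spdk_tgt" -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &
    pid2=$!
    waitforlisten "$pid2" /var/tmp/spdk2.sock       # succeeds despite the held lock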
00:07:42.541 [2024-12-05 12:49:45.824661] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.800 [2024-12-05 12:49:45.870864] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.060 12:49:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:43.060 12:49:46 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:43.060 12:49:46 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 144903 00:07:43.060 12:49:46 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 144903 00:07:43.060 12:49:46 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:44.438 lslocks: write error 00:07:44.438 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 144903 00:07:44.438 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 144903 ']' 00:07:44.438 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 144903 00:07:44.438 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:44.438 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:44.438 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 144903 00:07:44.438 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:44.438 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:44.438 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 144903' 00:07:44.438 killing process with pid 144903 00:07:44.438 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 144903 00:07:44.438 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 144903 00:07:44.698 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 145065 00:07:44.698 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 145065 ']' 00:07:44.698 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 145065 00:07:44.698 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:44.698 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:44.698 12:49:47 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 145065 00:07:44.698 12:49:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:44.698 12:49:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:44.698 12:49:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 145065' 00:07:44.698 killing 
process with pid 145065 00:07:44.698 12:49:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 145065 00:07:44.698 12:49:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 145065 00:07:45.267 00:07:45.267 real 0m2.914s 00:07:45.267 user 0m2.919s 00:07:45.267 sys 0m1.230s 00:07:45.267 12:49:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:45.267 12:49:48 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:45.267 ************************************ 00:07:45.267 END TEST non_locking_app_on_locked_coremask 00:07:45.267 ************************************ 00:07:45.267 12:49:48 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:07:45.267 12:49:48 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:45.267 12:49:48 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:45.267 12:49:48 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:45.267 ************************************ 00:07:45.267 START TEST locking_app_on_unlocked_coremask 00:07:45.267 ************************************ 00:07:45.267 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_unlocked_coremask 00:07:45.267 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=145430 00:07:45.267 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 145430 /var/tmp/spdk.sock 00:07:45.267 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:45.267 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 145430 ']' 00:07:45.267 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:45.267 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:45.267 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:45.267 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:45.267 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:45.267 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:45.267 [2024-12-05 12:49:48.395568] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:45.267 [2024-12-05 12:49:48.395632] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145430 ] 00:07:45.267 [2024-12-05 12:49:48.461570] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:45.267 [2024-12-05 12:49:48.461595] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.267 [2024-12-05 12:49:48.484172] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.527 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:45.527 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:45.527 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=145581 00:07:45.527 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 145581 /var/tmp/spdk2.sock 00:07:45.527 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:45.527 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 145581 ']' 00:07:45.527 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:45.527 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:45.527 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:45.527 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:45.527 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:45.528 12:49:48 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:45.528 [2024-12-05 12:49:48.711124] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
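locking_app_on_unlocked_coremask flips the roles: pid 145430 starts with --disable-cpumask-locks and leaves core 0 unlocked, so pid 145581, a normally-locking target on the same core, can claim the lock itself. In the same sketch form:

    "$spdk_tgt" -m 0x1 --disable-cpumask-locks &    # first target: core 0 left unlocked
    pid1=$!
    waitforlisten "$pid1"
    "$spdk_tgt" -m 0x1 -r /var/tmp/spdk2.sock &     # second target: free to take the lock
    pid2=$!
    waitforlisten "$pid2" /var/tmp/spdk2.sock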
00:07:45.528 [2024-12-05 12:49:48.711202] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145581 ] 00:07:45.528 [2024-12-05 12:49:48.808969] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.786 [2024-12-05 12:49:48.851428] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.044 12:49:49 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:46.044 12:49:49 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:46.044 12:49:49 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 145581 00:07:46.044 12:49:49 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 145581 00:07:46.044 12:49:49 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:46.983 lslocks: write error 00:07:46.983 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 145430 00:07:46.983 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 145430 ']' 00:07:46.983 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 145430 00:07:46.983 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:46.983 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:46.983 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 145430 00:07:46.983 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:46.983 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:46.983 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 145430' 00:07:46.983 killing process with pid 145430 00:07:46.983 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 145430 00:07:46.983 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 145430 00:07:47.553 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 145581 00:07:47.553 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 145581 ']' 00:07:47.553 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 145581 00:07:47.553 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:47.553 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:47.553 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 145581 00:07:47.553 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:47.553 12:49:50 
event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:47.553 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 145581' 00:07:47.553 killing process with pid 145581 00:07:47.553 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 145581 00:07:47.553 12:49:50 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 145581 00:07:48.123 00:07:48.123 real 0m2.766s 00:07:48.123 user 0m2.782s 00:07:48.123 sys 0m1.162s 00:07:48.123 12:49:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:48.123 12:49:51 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:48.123 ************************************ 00:07:48.123 END TEST locking_app_on_unlocked_coremask 00:07:48.123 ************************************ 00:07:48.123 12:49:51 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:48.124 12:49:51 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:48.124 12:49:51 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:48.124 12:49:51 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:48.124 ************************************ 00:07:48.124 START TEST locking_app_on_locked_coremask 00:07:48.124 ************************************ 00:07:48.124 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_locked_coremask 00:07:48.124 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=145991 00:07:48.124 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 145991 /var/tmp/spdk.sock 00:07:48.124 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:48.124 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 145991 ']' 00:07:48.124 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:48.124 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:48.124 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:48.124 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:48.124 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:48.124 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:48.124 [2024-12-05 12:49:51.250764] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
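The locks_exist check exercised in the test above (cpu_locks.sh@22 in the trace) confirms that the surviving target really holds its per-core lock. The stray "lslocks: write error" lines are benign: grep -q exits at the first match and closes the pipe while lslocks is still writing. A condensed sketch of the helper, assuming the lock files live at /var/tmp/spdk_cpu_lock_* as the cleanup's rm -f later in this log suggests:

locks_exist() {
    # spdk_tgt takes a file lock on /var/tmp/spdk_cpu_lock_NNN for every core
    # it claims; lslocks -p lists the locks a pid holds, so a match proves it.
    local pid=$1
    lslocks -p "$pid" | grep -q spdk_cpu_lock
}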
00:07:48.124 [2024-12-05 12:49:51.250861] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid145991 ] 00:07:48.124 [2024-12-05 12:49:51.337991] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.124 [2024-12-05 12:49:51.358079] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=146043 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 146043 /var/tmp/spdk2.sock 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 146043 /var/tmp/spdk2.sock 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # waitforlisten 146043 /var/tmp/spdk2.sock 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 146043 ']' 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:48.384 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:48.384 12:49:51 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:48.384 [2024-12-05 12:49:51.585780] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
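The NOT wrapper traced above inverts the assertion: this test passes only if waitforlisten on the second target fails, which it does once pid 146043 exits with "Unable to acquire lock on assigned core mask". Reduced to a sketch (the real helper also distinguishes signal deaths via the es > 128 check visible in the trace):

NOT() {
    # Succeed exactly when the wrapped command fails.
    if "$@"; then
        return 1
    fi
    return 0
}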
00:07:48.384 [2024-12-05 12:49:51.585848] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146043 ] 00:07:48.384 [2024-12-05 12:49:51.683953] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 145991 has claimed it. 00:07:48.384 [2024-12-05 12:49:51.684000] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:48.955 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (146043) - No such process 00:07:48.955 ERROR: process (pid: 146043) is no longer running 00:07:48.955 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:48.955 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:48.955 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:48.955 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:48.955 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:48.955 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:48.955 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 145991 00:07:48.955 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 145991 00:07:48.955 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:49.897 lslocks: write error 00:07:49.897 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 145991 00:07:49.897 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 145991 ']' 00:07:49.897 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 145991 00:07:49.897 12:49:52 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:49.897 12:49:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:49.897 12:49:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 145991 00:07:49.897 12:49:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:49.897 12:49:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:49.897 12:49:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 145991' 00:07:49.897 killing process with pid 145991 00:07:49.897 12:49:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 145991 00:07:49.897 12:49:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 145991 00:07:50.157 00:07:50.157 real 0m2.126s 00:07:50.157 user 0m2.253s 00:07:50.157 sys 0m0.826s 00:07:50.157 12:49:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:50.157 
12:49:53 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:50.157 ************************************ 00:07:50.157 END TEST locking_app_on_locked_coremask 00:07:50.157 ************************************ 00:07:50.157 12:49:53 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:50.157 12:49:53 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:50.157 12:49:53 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:50.157 12:49:53 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:50.157 ************************************ 00:07:50.157 START TEST locking_overlapped_coremask 00:07:50.157 ************************************ 00:07:50.157 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask 00:07:50.157 12:49:53 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=146499 00:07:50.157 12:49:53 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 146499 /var/tmp/spdk.sock 00:07:50.157 12:49:53 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:50.157 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 146499 ']' 00:07:50.157 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:50.157 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:50.157 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:50.157 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:50.157 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:50.157 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:50.157 [2024-12-05 12:49:53.458982] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
00:07:50.157 [2024-12-05 12:49:53.459040] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146499 ] 00:07:50.417 [2024-12-05 12:49:53.544168] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:50.417 [2024-12-05 12:49:53.568453] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:50.417 [2024-12-05 12:49:53.568579] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.417 [2024-12-05 12:49:53.568580] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=146558 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 146558 /var/tmp/spdk2.sock 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 146558 /var/tmp/spdk2.sock 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # waitforlisten 146558 /var/tmp/spdk2.sock 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 146558 ']' 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:50.679 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:50.679 12:49:53 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:50.679 [2024-12-05 12:49:53.800133] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
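The two masks are what makes this the "overlapped" case: -m 0x7 claims cores 0-2 for the first target and -m 0x1c asks for cores 2-4, so exactly one core is contested:

# 0x7  = 0b00111  -> cores 0,1,2   (first target)
# 0x1c = 0b11100  -> cores 2,3,4   (second target)
echo $(( 0x7 & 0x1c ))   # -> 4, i.e. bit 2: only core 2 overlaps

which is why the failure below reports core 2 specifically.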
00:07:50.679 [2024-12-05 12:49:53.800217] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146558 ] 00:07:50.679 [2024-12-05 12:49:53.899321] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 146499 has claimed it. 00:07:50.679 [2024-12-05 12:49:53.899359] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:51.250 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (146558) - No such process 00:07:51.250 ERROR: process (pid: 146558) is no longer running 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 146499 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # '[' -z 146499 ']' 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@958 -- # kill -0 146499 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # uname 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 146499 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 146499' 00:07:51.250 killing process with pid 146499 00:07:51.250 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@973 -- # kill 146499 00:07:51.250 12:49:54 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@978 -- # wait 146499 00:07:51.510 00:07:51.510 real 0m1.382s 00:07:51.510 user 0m3.844s 00:07:51.510 sys 0m0.436s 00:07:51.511 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:51.511 12:49:54 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:51.511 ************************************ 00:07:51.511 END TEST locking_overlapped_coremask 00:07:51.511 ************************************ 00:07:51.771 12:49:54 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:51.771 12:49:54 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:51.771 12:49:54 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:51.771 12:49:54 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:51.771 ************************************ 00:07:51.771 START TEST locking_overlapped_coremask_via_rpc 00:07:51.771 ************************************ 00:07:51.771 12:49:54 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask_via_rpc 00:07:51.771 12:49:54 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=146758 00:07:51.771 12:49:54 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 146758 /var/tmp/spdk.sock 00:07:51.771 12:49:54 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:51.771 12:49:54 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146758 ']' 00:07:51.771 12:49:54 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:51.771 12:49:54 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:51.771 12:49:54 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:51.771 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:51.771 12:49:54 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:51.771 12:49:54 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:51.771 [2024-12-05 12:49:54.929207] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:51.771 [2024-12-05 12:49:54.929284] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146758 ] 00:07:51.771 [2024-12-05 12:49:55.013293] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
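Unlike the previous test, both targets here start with --disable-cpumask-locks, so neither claims its cores at boot; the overlapping masks only collide once the locks are enabled over RPC further down. The setup, sketched with abbreviated paths:

# No /var/tmp/spdk_cpu_lock_* files are taken at startup:
./build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks &
./build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks &
# Both boot cleanly; the core-2 conflict surfaces only via framework_enable_cpumask_locks.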
00:07:51.771 [2024-12-05 12:49:55.013320] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:51.771 [2024-12-05 12:49:55.037248] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:51.771 [2024-12-05 12:49:55.037357] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.771 [2024-12-05 12:49:55.037358] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:52.031 12:49:55 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:52.031 12:49:55 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:52.031 12:49:55 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=146857 00:07:52.031 12:49:55 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 146857 /var/tmp/spdk2.sock 00:07:52.031 12:49:55 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:52.031 12:49:55 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146857 ']' 00:07:52.031 12:49:55 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:52.031 12:49:55 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:52.031 12:49:55 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:52.031 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:52.031 12:49:55 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:52.031 12:49:55 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:52.032 [2024-12-05 12:49:55.277559] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:52.032 [2024-12-05 12:49:55.277646] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid146857 ] 00:07:52.291 [2024-12-05 12:49:55.379059] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:52.291 [2024-12-05 12:49:55.379090] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:52.291 [2024-12-05 12:49:55.427671] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:52.291 [2024-12-05 12:49:55.427788] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:52.291 [2024-12-05 12:49:55.427790] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 4 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # local es=0 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:52.861 [2024-12-05 12:49:56.142898] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 146758 has claimed it. 
00:07:52.861 request: 00:07:52.861 { 00:07:52.861 "method": "framework_enable_cpumask_locks", 00:07:52.861 "req_id": 1 00:07:52.861 } 00:07:52.861 Got JSON-RPC error response 00:07:52.861 response: 00:07:52.861 { 00:07:52.861 "code": -32603, 00:07:52.861 "message": "Failed to claim CPU core: 2" 00:07:52.861 } 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:52.861 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # es=1 00:07:52.862 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:52.862 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:52.862 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:52.862 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 146758 /var/tmp/spdk.sock 00:07:52.862 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146758 ']' 00:07:52.862 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:52.862 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:52.862 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:52.862 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:52.862 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:52.862 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:53.121 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:53.121 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:53.121 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 146857 /var/tmp/spdk2.sock 00:07:53.121 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 146857 ']' 00:07:53.121 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:53.121 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:53.121 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:53.121 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
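The JSON above is the expected failure: the first target claimed cores 0-2 when its framework_enable_cpumask_locks call succeeded, so the second target's attempt dies on the shared core with internal error -32603. Reproduced by hand with the rpc.py client (socket names as in the trace):

./scripts/rpc.py framework_enable_cpumask_locks                        # first target: locks cores 0-2
./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks
# -> error -32603, "Failed to claim CPU core: 2"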
00:07:53.121 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:53.121 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:53.382 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:53.382 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:53.382 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:53.382 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:53.382 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:53.382 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:53.382 00:07:53.382 real 0m1.660s 00:07:53.382 user 0m0.780s 00:07:53.382 sys 0m0.169s 00:07:53.382 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:53.382 12:49:56 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:53.382 ************************************ 00:07:53.382 END TEST locking_overlapped_coremask_via_rpc 00:07:53.382 ************************************ 00:07:53.382 12:49:56 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:53.382 12:49:56 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 146758 ]] 00:07:53.382 12:49:56 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 146758 00:07:53.382 12:49:56 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146758 ']' 00:07:53.382 12:49:56 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146758 00:07:53.382 12:49:56 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:53.382 12:49:56 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:53.382 12:49:56 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 146758 00:07:53.382 12:49:56 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:53.382 12:49:56 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:53.382 12:49:56 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 146758' 00:07:53.382 killing process with pid 146758 00:07:53.382 12:49:56 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 146758 00:07:53.382 12:49:56 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 146758 00:07:53.952 12:49:56 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 146857 ]] 00:07:53.952 12:49:56 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 146857 00:07:53.952 12:49:56 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146857 ']' 00:07:53.952 12:49:56 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146857 00:07:53.952 12:49:56 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:53.952 12:49:56 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
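check_remaining_locks, traced at the top of this block (cpu_locks.sh@36-38), asserts that exactly the three lock files implied by the 0x7 mask exist on disk. Condensed, with string equality standing in for the escaped pattern match in the trace:

check_remaining_locks() {
    # Glob what is on disk and compare against cores 000-002.
    locks=(/var/tmp/spdk_cpu_lock_*)
    locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
    [[ "${locks[*]}" == "${locks_expected[*]}" ]]
}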
00:07:53.952 12:49:56 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 146857 00:07:53.952 12:49:57 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:53.952 12:49:57 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:53.952 12:49:57 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 146857' 00:07:53.952 killing process with pid 146857 00:07:53.952 12:49:57 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 146857 00:07:53.952 12:49:57 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 146857 00:07:54.213 12:49:57 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:54.213 12:49:57 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:54.213 12:49:57 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 146758 ]] 00:07:54.213 12:49:57 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 146758 00:07:54.213 12:49:57 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146758 ']' 00:07:54.213 12:49:57 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146758 00:07:54.213 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (146758) - No such process 00:07:54.213 12:49:57 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 146758 is not found' 00:07:54.213 Process with pid 146758 is not found 00:07:54.213 12:49:57 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 146857 ]] 00:07:54.213 12:49:57 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 146857 00:07:54.213 12:49:57 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 146857 ']' 00:07:54.213 12:49:57 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 146857 00:07:54.213 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (146857) - No such process 00:07:54.213 12:49:57 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 146857 is not found' 00:07:54.213 Process with pid 146857 is not found 00:07:54.213 12:49:57 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:54.213 00:07:54.213 real 0m14.910s 00:07:54.213 user 0m24.859s 00:07:54.213 sys 0m6.197s 00:07:54.213 12:49:57 event.cpu_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:54.213 12:49:57 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:54.213 ************************************ 00:07:54.213 END TEST cpu_locks 00:07:54.213 ************************************ 00:07:54.213 00:07:54.213 real 0m39.962s 00:07:54.213 user 1m14.697s 00:07:54.213 sys 0m10.590s 00:07:54.213 12:49:57 event -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:54.213 12:49:57 event -- common/autotest_common.sh@10 -- # set +x 00:07:54.213 ************************************ 00:07:54.213 END TEST event 00:07:54.213 ************************************ 00:07:54.213 12:49:57 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:54.213 12:49:57 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:54.213 12:49:57 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:54.213 12:49:57 -- common/autotest_common.sh@10 -- # set +x 00:07:54.213 ************************************ 00:07:54.213 START TEST thread 00:07:54.213 ************************************ 00:07:54.213 12:49:57 thread -- common/autotest_common.sh@1129 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:54.474 * Looking for test storage... 00:07:54.474 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:54.474 12:49:57 thread -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:07:54.474 12:49:57 thread -- common/autotest_common.sh@1711 -- # lcov --version 00:07:54.474 12:49:57 thread -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:07:54.474 12:49:57 thread -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:07:54.474 12:49:57 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:54.474 12:49:57 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:54.474 12:49:57 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:54.474 12:49:57 thread -- scripts/common.sh@336 -- # IFS=.-: 00:07:54.474 12:49:57 thread -- scripts/common.sh@336 -- # read -ra ver1 00:07:54.474 12:49:57 thread -- scripts/common.sh@337 -- # IFS=.-: 00:07:54.474 12:49:57 thread -- scripts/common.sh@337 -- # read -ra ver2 00:07:54.474 12:49:57 thread -- scripts/common.sh@338 -- # local 'op=<' 00:07:54.474 12:49:57 thread -- scripts/common.sh@340 -- # ver1_l=2 00:07:54.474 12:49:57 thread -- scripts/common.sh@341 -- # ver2_l=1 00:07:54.474 12:49:57 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:54.474 12:49:57 thread -- scripts/common.sh@344 -- # case "$op" in 00:07:54.474 12:49:57 thread -- scripts/common.sh@345 -- # : 1 00:07:54.474 12:49:57 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:54.474 12:49:57 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:54.474 12:49:57 thread -- scripts/common.sh@365 -- # decimal 1 00:07:54.474 12:49:57 thread -- scripts/common.sh@353 -- # local d=1 00:07:54.474 12:49:57 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:54.474 12:49:57 thread -- scripts/common.sh@355 -- # echo 1 00:07:54.475 12:49:57 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:07:54.475 12:49:57 thread -- scripts/common.sh@366 -- # decimal 2 00:07:54.475 12:49:57 thread -- scripts/common.sh@353 -- # local d=2 00:07:54.475 12:49:57 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:54.475 12:49:57 thread -- scripts/common.sh@355 -- # echo 2 00:07:54.475 12:49:57 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:07:54.475 12:49:57 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:54.475 12:49:57 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:54.475 12:49:57 thread -- scripts/common.sh@368 -- # return 0 00:07:54.475 12:49:57 thread -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:54.475 12:49:57 thread -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:07:54.475 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.475 --rc genhtml_branch_coverage=1 00:07:54.475 --rc genhtml_function_coverage=1 00:07:54.475 --rc genhtml_legend=1 00:07:54.475 --rc geninfo_all_blocks=1 00:07:54.475 --rc geninfo_unexecuted_blocks=1 00:07:54.475 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.475 ' 00:07:54.475 12:49:57 thread -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:07:54.475 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.475 --rc genhtml_branch_coverage=1 00:07:54.475 --rc genhtml_function_coverage=1 00:07:54.475 --rc genhtml_legend=1 00:07:54.475 --rc geninfo_all_blocks=1 
00:07:54.475 --rc geninfo_unexecuted_blocks=1 00:07:54.475 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.475 ' 00:07:54.475 12:49:57 thread -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:07:54.475 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.475 --rc genhtml_branch_coverage=1 00:07:54.475 --rc genhtml_function_coverage=1 00:07:54.475 --rc genhtml_legend=1 00:07:54.475 --rc geninfo_all_blocks=1 00:07:54.475 --rc geninfo_unexecuted_blocks=1 00:07:54.475 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.475 ' 00:07:54.475 12:49:57 thread -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:07:54.475 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:54.475 --rc genhtml_branch_coverage=1 00:07:54.475 --rc genhtml_function_coverage=1 00:07:54.475 --rc genhtml_legend=1 00:07:54.475 --rc geninfo_all_blocks=1 00:07:54.475 --rc geninfo_unexecuted_blocks=1 00:07:54.475 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:54.475 ' 00:07:54.475 12:49:57 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:54.475 12:49:57 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:54.475 12:49:57 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:54.475 12:49:57 thread -- common/autotest_common.sh@10 -- # set +x 00:07:54.475 ************************************ 00:07:54.475 START TEST thread_poller_perf 00:07:54.475 ************************************ 00:07:54.475 12:49:57 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:54.475 [2024-12-05 12:49:57.717405] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:54.475 [2024-12-05 12:49:57.717490] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid147263 ] 00:07:54.735 [2024-12-05 12:49:57.802564] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.735 [2024-12-05 12:49:57.824317] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.735 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:55.677 [2024-12-05T11:49:58.991Z] ====================================== 00:07:55.677 [2024-12-05T11:49:58.991Z] busy:2504453944 (cyc) 00:07:55.677 [2024-12-05T11:49:58.991Z] total_run_count: 868000 00:07:55.677 [2024-12-05T11:49:58.991Z] tsc_hz: 2500000000 (cyc) 00:07:55.677 [2024-12-05T11:49:58.991Z] ====================================== 00:07:55.677 [2024-12-05T11:49:58.991Z] poller_cost: 2885 (cyc), 1154 (nsec) 00:07:55.677 00:07:55.677 real 0m1.155s 00:07:55.677 user 0m1.061s 00:07:55.677 sys 0m0.090s 00:07:55.677 12:49:58 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:55.677 12:49:58 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:55.677 ************************************ 00:07:55.677 END TEST thread_poller_perf 00:07:55.677 ************************************ 00:07:55.677 12:49:58 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:55.677 12:49:58 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:55.677 12:49:58 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:55.677 12:49:58 thread -- common/autotest_common.sh@10 -- # set +x 00:07:55.677 ************************************ 00:07:55.677 START TEST thread_poller_perf 00:07:55.677 ************************************ 00:07:55.677 12:49:58 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:55.677 [2024-12-05 12:49:58.954781] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:55.677 [2024-12-05 12:49:58.954888] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid147531 ] 00:07:55.937 [2024-12-05 12:49:59.042556] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.937 [2024-12-05 12:49:59.064725] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.937 Running 1000 pollers for 1 seconds with 0 microseconds period. 
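The poller_cost figure for the run just completed is the busy-cycle counter divided by the run count, converted to wall time with the reported TSC rate:

echo $(( 2504453944 / 868000 ))               # -> 2885 cycles per poller invocation
echo $(( 2885 * 1000000000 / 2500000000 ))    # -> 1154 ns at tsc_hz 2500000000

The second run, whose results follow, uses a 0 microsecond period (-l 0) and drops the per-call overhead to 186 cycles.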
00:07:56.876 [2024-12-05T11:50:00.190Z] ====================================== 00:07:56.876 [2024-12-05T11:50:00.190Z] busy:2501456934 (cyc) 00:07:56.876 [2024-12-05T11:50:00.190Z] total_run_count: 13436000 00:07:56.876 [2024-12-05T11:50:00.190Z] tsc_hz: 2500000000 (cyc) 00:07:56.876 [2024-12-05T11:50:00.190Z] ====================================== 00:07:56.876 [2024-12-05T11:50:00.190Z] poller_cost: 186 (cyc), 74 (nsec) 00:07:56.876 00:07:56.876 real 0m1.161s 00:07:56.876 user 0m1.063s 00:07:56.876 sys 0m0.094s 00:07:56.876 12:50:00 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:56.876 12:50:00 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:56.876 ************************************ 00:07:56.876 END TEST thread_poller_perf 00:07:56.876 ************************************ 00:07:56.876 12:50:00 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:56.876 12:50:00 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:56.876 12:50:00 thread -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:56.876 12:50:00 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:56.876 12:50:00 thread -- common/autotest_common.sh@10 -- # set +x 00:07:56.876 ************************************ 00:07:56.876 START TEST thread_spdk_lock 00:07:56.876 ************************************ 00:07:56.876 12:50:00 thread.thread_spdk_lock -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:57.136 [2024-12-05 12:50:00.202311] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:07:57.136 [2024-12-05 12:50:00.202421] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid147811 ] 00:07:57.136 [2024-12-05 12:50:00.291708] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:57.136 [2024-12-05 12:50:00.319082] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.136 [2024-12-05 12:50:00.319081] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:57.707 [2024-12-05 12:50:00.820730] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 980:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:57.707 [2024-12-05 12:50:00.820764] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3112:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:57.707 [2024-12-05 12:50:00.820775] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3067:sspin_stacks_print: *ERROR*: spinlock 0x1327280 00:07:57.707 [2024-12-05 12:50:00.821411] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:57.707 [2024-12-05 12:50:00.821514] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1041:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:57.707 [2024-12-05 
12:50:00.821533] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:57.707 Starting test contend 00:07:57.707 Worker Delay Wait us Hold us Total us 00:07:57.707 0 3 159034 190622 349656 00:07:57.707 1 5 80512 290808 371320 00:07:57.707 PASS test contend 00:07:57.707 Starting test hold_by_poller 00:07:57.707 PASS test hold_by_poller 00:07:57.707 Starting test hold_by_message 00:07:57.707 PASS test hold_by_message 00:07:57.707 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:57.707 100014 assertions passed 00:07:57.707 0 assertions failed 00:07:57.707 00:07:57.707 real 0m0.664s 00:07:57.707 user 0m1.067s 00:07:57.707 sys 0m0.096s 00:07:57.707 12:50:00 thread.thread_spdk_lock -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:57.707 12:50:00 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:07:57.707 ************************************ 00:07:57.707 END TEST thread_spdk_lock 00:07:57.707 ************************************ 00:07:57.707 00:07:57.707 real 0m3.425s 00:07:57.707 user 0m3.400s 00:07:57.707 sys 0m0.554s 00:07:57.707 12:50:00 thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:57.707 12:50:00 thread -- common/autotest_common.sh@10 -- # set +x 00:07:57.707 ************************************ 00:07:57.707 END TEST thread 00:07:57.707 ************************************ 00:07:57.707 12:50:00 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:07:57.707 12:50:00 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:57.707 12:50:00 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:57.707 12:50:00 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:57.707 12:50:00 -- common/autotest_common.sh@10 -- # set +x 00:07:57.707 ************************************ 00:07:57.707 START TEST app_cmdline 00:07:57.707 ************************************ 00:07:57.707 12:50:00 app_cmdline -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:57.967 * Looking for test storage... 
00:07:57.967 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:57.967 12:50:01 app_cmdline -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:07:57.967 12:50:01 app_cmdline -- common/autotest_common.sh@1711 -- # lcov --version 00:07:57.967 12:50:01 app_cmdline -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:07:57.967 12:50:01 app_cmdline -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@345 -- # : 1 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:57.967 12:50:01 app_cmdline -- scripts/common.sh@368 -- # return 0 00:07:57.967 12:50:01 app_cmdline -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:57.967 12:50:01 app_cmdline -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:07:57.967 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:57.967 --rc genhtml_branch_coverage=1 00:07:57.967 --rc genhtml_function_coverage=1 00:07:57.967 --rc genhtml_legend=1 00:07:57.967 --rc geninfo_all_blocks=1 00:07:57.967 --rc geninfo_unexecuted_blocks=1 00:07:57.967 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:57.967 ' 00:07:57.967 12:50:01 app_cmdline -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:07:57.967 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:57.967 --rc genhtml_branch_coverage=1 00:07:57.967 --rc genhtml_function_coverage=1 00:07:57.967 --rc 
genhtml_legend=1 00:07:57.967 --rc geninfo_all_blocks=1 00:07:57.967 --rc geninfo_unexecuted_blocks=1 00:07:57.967 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:57.967 ' 00:07:57.967 12:50:01 app_cmdline -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:07:57.967 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:57.967 --rc genhtml_branch_coverage=1 00:07:57.967 --rc genhtml_function_coverage=1 00:07:57.967 --rc genhtml_legend=1 00:07:57.967 --rc geninfo_all_blocks=1 00:07:57.967 --rc geninfo_unexecuted_blocks=1 00:07:57.967 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:57.967 ' 00:07:57.967 12:50:01 app_cmdline -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:07:57.967 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:57.967 --rc genhtml_branch_coverage=1 00:07:57.968 --rc genhtml_function_coverage=1 00:07:57.968 --rc genhtml_legend=1 00:07:57.968 --rc geninfo_all_blocks=1 00:07:57.968 --rc geninfo_unexecuted_blocks=1 00:07:57.968 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:57.968 ' 00:07:57.968 12:50:01 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:57.968 12:50:01 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=148087 00:07:57.968 12:50:01 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 148087 00:07:57.968 12:50:01 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:57.968 12:50:01 app_cmdline -- common/autotest_common.sh@835 -- # '[' -z 148087 ']' 00:07:57.968 12:50:01 app_cmdline -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:57.968 12:50:01 app_cmdline -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:57.968 12:50:01 app_cmdline -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:57.968 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:57.968 12:50:01 app_cmdline -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:57.968 12:50:01 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:57.968 [2024-12-05 12:50:01.198059] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
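cmdline.sh starts this target with --rpcs-allowed spdk_get_version,rpc_get_methods, reducing the RPC surface to exactly those two methods. The allow-list behavior, sketched with abbreviated paths:

./build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
./scripts/rpc.py spdk_get_version           # allowed: returns the version object below
./scripts/rpc.py env_dpdk_get_mem_stats     # filtered: JSON-RPC -32601 "Method not found"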
00:07:57.968 [2024-12-05 12:50:01.198133] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148087 ] 00:07:58.227 [2024-12-05 12:50:01.283665] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.227 [2024-12-05 12:50:01.305628] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.227 12:50:01 app_cmdline -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:58.227 12:50:01 app_cmdline -- common/autotest_common.sh@868 -- # return 0 00:07:58.227 12:50:01 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:58.486 { 00:07:58.486 "version": "SPDK v25.01-pre git sha1 8d3947977", 00:07:58.486 "fields": { 00:07:58.486 "major": 25, 00:07:58.486 "minor": 1, 00:07:58.486 "patch": 0, 00:07:58.486 "suffix": "-pre", 00:07:58.486 "commit": "8d3947977" 00:07:58.486 } 00:07:58.486 } 00:07:58.486 12:50:01 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:58.486 12:50:01 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:58.486 12:50:01 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:58.486 12:50:01 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:58.486 12:50:01 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:58.486 12:50:01 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:58.486 12:50:01 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:58.486 12:50:01 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:58.486 12:50:01 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:58.486 12:50:01 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@652 -- # local es=0 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:58.486 12:50:01 app_cmdline -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:58.487 12:50:01 app_cmdline -- 
common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:58.487 12:50:01 app_cmdline -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:58.747 request: 00:07:58.747 { 00:07:58.747 "method": "env_dpdk_get_mem_stats", 00:07:58.747 "req_id": 1 00:07:58.747 } 00:07:58.747 Got JSON-RPC error response 00:07:58.747 response: 00:07:58.747 { 00:07:58.747 "code": -32601, 00:07:58.747 "message": "Method not found" 00:07:58.747 } 00:07:58.747 12:50:01 app_cmdline -- common/autotest_common.sh@655 -- # es=1 00:07:58.747 12:50:01 app_cmdline -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:58.747 12:50:01 app_cmdline -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:58.747 12:50:01 app_cmdline -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:58.747 12:50:01 app_cmdline -- app/cmdline.sh@1 -- # killprocess 148087 00:07:58.747 12:50:01 app_cmdline -- common/autotest_common.sh@954 -- # '[' -z 148087 ']' 00:07:58.747 12:50:01 app_cmdline -- common/autotest_common.sh@958 -- # kill -0 148087 00:07:58.747 12:50:01 app_cmdline -- common/autotest_common.sh@959 -- # uname 00:07:58.747 12:50:01 app_cmdline -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:58.747 12:50:01 app_cmdline -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 148087 00:07:58.747 12:50:02 app_cmdline -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:58.747 12:50:02 app_cmdline -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:58.747 12:50:02 app_cmdline -- common/autotest_common.sh@972 -- # echo 'killing process with pid 148087' 00:07:58.747 killing process with pid 148087 00:07:58.747 12:50:02 app_cmdline -- common/autotest_common.sh@973 -- # kill 148087 00:07:58.747 12:50:02 app_cmdline -- common/autotest_common.sh@978 -- # wait 148087 00:07:59.007 00:07:59.007 real 0m1.306s 00:07:59.007 user 0m1.497s 00:07:59.007 sys 0m0.505s 00:07:59.007 12:50:02 app_cmdline -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:59.007 12:50:02 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:59.007 ************************************ 00:07:59.007 END TEST app_cmdline 00:07:59.007 ************************************ 00:07:59.267 12:50:02 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:59.267 12:50:02 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:59.267 12:50:02 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:59.267 12:50:02 -- common/autotest_common.sh@10 -- # set +x 00:07:59.267 ************************************ 00:07:59.267 START TEST version 00:07:59.267 ************************************ 00:07:59.267 12:50:02 version -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:59.267 * Looking for test storage... 
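The app_cmdline test that just ended exercises the RPC allowlist: spdk_tgt is started with --rpcs-allowed spdk_get_version,rpc_get_methods, rpc_get_methods must report exactly those two methods, and the non-allowlisted env_dpdk_get_mem_stats must fail with the JSON-RPC -32601 "Method not found" error shown above. A condensed sketch of that flow (the paths and method names come from the trace; the sleep stand-in for waitforlisten and the compact jq/xargs pipeline are assumptions):

rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk    # per the trace
"$rootdir/build/bin/spdk_tgt" --rpcs-allowed spdk_get_version,rpc_get_methods &
tgt_pid=$!
sleep 1    # stand-in for waitforlisten on /var/tmp/spdk.sock
# Exactly the two allowlisted methods must be visible:
methods=$("$rootdir/scripts/rpc.py" rpc_get_methods | jq -r '.[]' | sort | xargs)
[[ $methods == "rpc_get_methods spdk_get_version" ]] || exit 1
# Anything outside the allowlist must fail with -32601 "Method not found":
if "$rootdir/scripts/rpc.py" env_dpdk_get_mem_stats; then
    echo "non-allowlisted RPC unexpectedly succeeded" >&2
    exit 1
fi
kill "$tgt_pid"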
00:07:59.267 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:59.267 12:50:02 version -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:07:59.267 12:50:02 version -- common/autotest_common.sh@1711 -- # lcov --version 00:07:59.267 12:50:02 version -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:07:59.267 12:50:02 version -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:07:59.267 12:50:02 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:59.267 12:50:02 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:59.267 12:50:02 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:59.267 12:50:02 version -- scripts/common.sh@336 -- # IFS=.-: 00:07:59.267 12:50:02 version -- scripts/common.sh@336 -- # read -ra ver1 00:07:59.267 12:50:02 version -- scripts/common.sh@337 -- # IFS=.-: 00:07:59.267 12:50:02 version -- scripts/common.sh@337 -- # read -ra ver2 00:07:59.267 12:50:02 version -- scripts/common.sh@338 -- # local 'op=<' 00:07:59.267 12:50:02 version -- scripts/common.sh@340 -- # ver1_l=2 00:07:59.267 12:50:02 version -- scripts/common.sh@341 -- # ver2_l=1 00:07:59.267 12:50:02 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:59.267 12:50:02 version -- scripts/common.sh@344 -- # case "$op" in 00:07:59.267 12:50:02 version -- scripts/common.sh@345 -- # : 1 00:07:59.268 12:50:02 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:59.268 12:50:02 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:59.268 12:50:02 version -- scripts/common.sh@365 -- # decimal 1 00:07:59.268 12:50:02 version -- scripts/common.sh@353 -- # local d=1 00:07:59.268 12:50:02 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:59.268 12:50:02 version -- scripts/common.sh@355 -- # echo 1 00:07:59.268 12:50:02 version -- scripts/common.sh@365 -- # ver1[v]=1 00:07:59.268 12:50:02 version -- scripts/common.sh@366 -- # decimal 2 00:07:59.268 12:50:02 version -- scripts/common.sh@353 -- # local d=2 00:07:59.268 12:50:02 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:59.268 12:50:02 version -- scripts/common.sh@355 -- # echo 2 00:07:59.268 12:50:02 version -- scripts/common.sh@366 -- # ver2[v]=2 00:07:59.268 12:50:02 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:59.268 12:50:02 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:59.268 12:50:02 version -- scripts/common.sh@368 -- # return 0 00:07:59.268 12:50:02 version -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:59.268 12:50:02 version -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:07:59.268 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.268 --rc genhtml_branch_coverage=1 00:07:59.268 --rc genhtml_function_coverage=1 00:07:59.268 --rc genhtml_legend=1 00:07:59.268 --rc geninfo_all_blocks=1 00:07:59.268 --rc geninfo_unexecuted_blocks=1 00:07:59.268 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.268 ' 00:07:59.268 12:50:02 version -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:07:59.268 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.268 --rc genhtml_branch_coverage=1 00:07:59.268 --rc genhtml_function_coverage=1 00:07:59.268 --rc genhtml_legend=1 00:07:59.268 --rc geninfo_all_blocks=1 00:07:59.268 --rc geninfo_unexecuted_blocks=1 00:07:59.268 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.268 ' 00:07:59.268 12:50:02 version -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:07:59.268 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.268 --rc genhtml_branch_coverage=1 00:07:59.268 --rc genhtml_function_coverage=1 00:07:59.268 --rc genhtml_legend=1 00:07:59.268 --rc geninfo_all_blocks=1 00:07:59.268 --rc geninfo_unexecuted_blocks=1 00:07:59.268 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.268 ' 00:07:59.268 12:50:02 version -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:07:59.268 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.268 --rc genhtml_branch_coverage=1 00:07:59.268 --rc genhtml_function_coverage=1 00:07:59.268 --rc genhtml_legend=1 00:07:59.268 --rc geninfo_all_blocks=1 00:07:59.268 --rc geninfo_unexecuted_blocks=1 00:07:59.268 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.268 ' 00:07:59.268 12:50:02 version -- app/version.sh@17 -- # get_header_version major 00:07:59.268 12:50:02 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:59.268 12:50:02 version -- app/version.sh@14 -- # cut -f2 00:07:59.268 12:50:02 version -- app/version.sh@14 -- # tr -d '"' 00:07:59.268 12:50:02 version -- app/version.sh@17 -- # major=25 00:07:59.268 12:50:02 version -- app/version.sh@18 -- # get_header_version minor 00:07:59.268 12:50:02 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:59.268 12:50:02 version -- app/version.sh@14 -- # cut -f2 00:07:59.268 12:50:02 version -- app/version.sh@14 -- # tr -d '"' 00:07:59.529 12:50:02 version -- app/version.sh@18 -- # minor=1 00:07:59.529 12:50:02 version -- app/version.sh@19 -- # get_header_version patch 00:07:59.529 12:50:02 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:59.529 12:50:02 version -- app/version.sh@14 -- # cut -f2 00:07:59.529 12:50:02 version -- app/version.sh@14 -- # tr -d '"' 00:07:59.529 12:50:02 version -- app/version.sh@19 -- # patch=0 00:07:59.529 12:50:02 version -- app/version.sh@20 -- # get_header_version suffix 00:07:59.529 12:50:02 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:59.529 12:50:02 version -- app/version.sh@14 -- # cut -f2 00:07:59.529 12:50:02 version -- app/version.sh@14 -- # tr -d '"' 00:07:59.529 12:50:02 version -- app/version.sh@20 -- # suffix=-pre 00:07:59.529 12:50:02 version -- app/version.sh@22 -- # version=25.1 00:07:59.529 12:50:02 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:59.529 12:50:02 version -- app/version.sh@28 -- # version=25.1rc0 00:07:59.529 12:50:02 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:59.529 12:50:02 version -- app/version.sh@30 -- # 
python3 -c 'import spdk; print(spdk.__version__)' 00:07:59.529 12:50:02 version -- app/version.sh@30 -- # py_version=25.1rc0 00:07:59.529 12:50:02 version -- app/version.sh@31 -- # [[ 25.1rc0 == \2\5\.\1\r\c\0 ]] 00:07:59.529 00:07:59.529 real 0m0.280s 00:07:59.529 user 0m0.165s 00:07:59.529 sys 0m0.172s 00:07:59.529 12:50:02 version -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:59.529 12:50:02 version -- common/autotest_common.sh@10 -- # set +x 00:07:59.529 ************************************ 00:07:59.529 END TEST version 00:07:59.529 ************************************ 00:07:59.529 12:50:02 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:07:59.529 12:50:02 -- spdk/autotest.sh@194 -- # uname -s 00:07:59.529 12:50:02 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:59.529 12:50:02 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:59.529 12:50:02 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:59.529 12:50:02 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@260 -- # timing_exit lib 00:07:59.529 12:50:02 -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:59.529 12:50:02 -- common/autotest_common.sh@10 -- # set +x 00:07:59.529 12:50:02 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@267 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@276 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:59.529 12:50:02 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:59.529 12:50:02 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:59.529 12:50:02 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:59.529 12:50:02 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:59.529 12:50:02 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:59.529 12:50:02 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:59.529 12:50:02 -- common/autotest_common.sh@10 -- # set +x 00:07:59.529 ************************************ 00:07:59.529 START TEST llvm_fuzz 00:07:59.529 ************************************ 00:07:59.529 12:50:02 llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:59.790 * Looking for test storage... 
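The version test that just finished cross-checks two sources of truth: the C header and the Python package. get_header_version greps one #define at a time out of include/spdk/version.h (tab-delimited, quotes stripped by tr), the pieces are reassembled into 25.1rc0 because patch is 0 and the suffix is -pre, and the result must equal what python3 -c 'import spdk; print(spdk.__version__)' reports. A sketch of the same steps (the ${1^^} uppercasing is an assumption about how the helper forms the macro name; PYTHONPATH must point at the spdk python dir, as the trace sets it):

hdr=$rootdir/include/spdk/version.h        # rootdir as in the trace
get_header_version() {                     # e.g. "major" -> value of SPDK_VERSION_MAJOR
    grep -E "^#define SPDK_VERSION_${1^^}[[:space:]]+" "$hdr" | cut -f2 | tr -d '"'
}
major=$(get_header_version major)          # 25
minor=$(get_header_version minor)          # 1
patch=$(get_header_version patch)          # 0
suffix=$(get_header_version suffix)        # -pre
version=$major.$minor
(( patch != 0 )) && version=$version.$patch
[[ $suffix == -pre ]] && version=${version}rc0      # -> 25.1rc0
py_version=$(python3 -c 'import spdk; print(spdk.__version__)')
[[ $py_version == "$version" ]]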
00:07:59.790 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:59.790 12:50:02 llvm_fuzz -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:07:59.790 12:50:02 llvm_fuzz -- common/autotest_common.sh@1711 -- # lcov --version 00:07:59.790 12:50:02 llvm_fuzz -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:07:59.790 12:50:02 llvm_fuzz -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:07:59.790 12:50:02 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:59.791 12:50:02 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:07:59.791 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.791 --rc genhtml_branch_coverage=1 00:07:59.791 --rc genhtml_function_coverage=1 00:07:59.791 --rc genhtml_legend=1 00:07:59.791 --rc geninfo_all_blocks=1 00:07:59.791 --rc geninfo_unexecuted_blocks=1 00:07:59.791 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.791 ' 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:07:59.791 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.791 --rc genhtml_branch_coverage=1 00:07:59.791 --rc genhtml_function_coverage=1 00:07:59.791 --rc genhtml_legend=1 00:07:59.791 --rc geninfo_all_blocks=1 00:07:59.791 --rc 
geninfo_unexecuted_blocks=1 00:07:59.791 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.791 ' 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:07:59.791 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.791 --rc genhtml_branch_coverage=1 00:07:59.791 --rc genhtml_function_coverage=1 00:07:59.791 --rc genhtml_legend=1 00:07:59.791 --rc geninfo_all_blocks=1 00:07:59.791 --rc geninfo_unexecuted_blocks=1 00:07:59.791 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.791 ' 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:07:59.791 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:59.791 --rc genhtml_branch_coverage=1 00:07:59.791 --rc genhtml_function_coverage=1 00:07:59.791 --rc genhtml_legend=1 00:07:59.791 --rc geninfo_all_blocks=1 00:07:59.791 --rc geninfo_unexecuted_blocks=1 00:07:59.791 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:59.791 ' 00:07:59.791 12:50:02 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:59.791 12:50:02 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@550 -- # fuzzers=() 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@550 -- # local fuzzers 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@552 -- # [[ -n '' ]] 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@555 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@556 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@559 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:59.791 12:50:02 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:59.791 12:50:02 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:59.791 12:50:02 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:59.791 12:50:02 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:59.791 12:50:02 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:59.791 12:50:02 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:59.791 12:50:02 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:59.791 12:50:02 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:59.791 12:50:02 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:59.791 12:50:02 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:07:59.791 ************************************ 00:07:59.791 START TEST nvmf_llvm_fuzz 00:07:59.791 ************************************ 00:07:59.791 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:08:00.057 * Looking for test storage... 
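The START TEST nvmf_llvm_fuzz above is reached through llvm.sh's target discovery: with no explicit target list, everything under test/fuzz/llvm/ is globbed, reduced to basenames (common.sh llvm-gcov.sh nvmf vfio), and dispatched through a case statement so that only the real fuzzer directories get a run_test. A sketch reconstructed from the xtrace (reading an override list from SPDK_TEST_FUZZER_TARGET and the exact case arms are assumptions):

get_fuzzer_targets() {      # sketch of the helper traced in common/autotest_common.sh
    local fuzzers=()
    [[ -n $SPDK_TEST_FUZZER_TARGET ]] && fuzzers=($SPDK_TEST_FUZZER_TARGET)  # assumed override
    (( ${#fuzzers[@]} )) || fuzzers=("$rootdir/test/fuzz/llvm/"*)
    fuzzers=("${fuzzers[@]##*/}")        # -> common.sh llvm-gcov.sh nvmf vfio
    echo "${fuzzers[*]}"
}
for fuzzer in $(get_fuzzer_targets); do
    case "$fuzzer" in
        nvmf | vfio) run_test "${fuzzer}_llvm_fuzz" "$rootdir/test/fuzz/llvm/$fuzzer/run.sh" ;;
        *) ;;       # helper files (common.sh, llvm-gcov.sh) fall through untested
    esac
done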
00:08:00.057 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.057 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:08:00.057 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1711 -- # lcov --version 00:08:00.057 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:08:00.058 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.058 --rc genhtml_branch_coverage=1 00:08:00.058 --rc genhtml_function_coverage=1 00:08:00.058 --rc genhtml_legend=1 00:08:00.058 --rc geninfo_all_blocks=1 00:08:00.058 --rc geninfo_unexecuted_blocks=1 00:08:00.058 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.058 ' 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:08:00.058 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.058 --rc genhtml_branch_coverage=1 00:08:00.058 --rc genhtml_function_coverage=1 00:08:00.058 --rc genhtml_legend=1 00:08:00.058 --rc geninfo_all_blocks=1 00:08:00.058 --rc geninfo_unexecuted_blocks=1 00:08:00.058 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.058 ' 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:08:00.058 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.058 --rc genhtml_branch_coverage=1 00:08:00.058 --rc genhtml_function_coverage=1 00:08:00.058 --rc genhtml_legend=1 00:08:00.058 --rc geninfo_all_blocks=1 00:08:00.058 --rc geninfo_unexecuted_blocks=1 00:08:00.058 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.058 ' 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:08:00.058 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.058 --rc genhtml_branch_coverage=1 00:08:00.058 --rc genhtml_function_coverage=1 00:08:00.058 --rc genhtml_legend=1 00:08:00.058 --rc geninfo_all_blocks=1 00:08:00.058 --rc geninfo_unexecuted_blocks=1 00:08:00.058 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.058 ' 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:08:00.058 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:08:00.059 12:50:03 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:00.059 12:50:03 
llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:00.059 #define SPDK_CONFIG_H 00:08:00.059 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:00.059 #define SPDK_CONFIG_APPS 1 00:08:00.059 #define SPDK_CONFIG_ARCH native 00:08:00.059 #undef SPDK_CONFIG_ASAN 00:08:00.059 #undef SPDK_CONFIG_AVAHI 00:08:00.059 #undef SPDK_CONFIG_CET 00:08:00.059 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:00.059 #define SPDK_CONFIG_COVERAGE 1 00:08:00.059 #define SPDK_CONFIG_CROSS_PREFIX 00:08:00.059 #undef SPDK_CONFIG_CRYPTO 00:08:00.059 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:00.059 #undef SPDK_CONFIG_CUSTOMOCF 00:08:00.059 #undef SPDK_CONFIG_DAOS 00:08:00.059 #define SPDK_CONFIG_DAOS_DIR 00:08:00.059 #define SPDK_CONFIG_DEBUG 1 00:08:00.059 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:00.059 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:00.059 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:00.059 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:00.059 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:00.059 #undef SPDK_CONFIG_DPDK_UADK 00:08:00.059 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:00.059 #define SPDK_CONFIG_EXAMPLES 1 00:08:00.059 #undef SPDK_CONFIG_FC 00:08:00.059 #define SPDK_CONFIG_FC_PATH 00:08:00.059 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:00.059 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:00.059 #define SPDK_CONFIG_FSDEV 1 00:08:00.059 #undef 
SPDK_CONFIG_FUSE 00:08:00.059 #define SPDK_CONFIG_FUZZER 1 00:08:00.059 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:00.059 #undef SPDK_CONFIG_GOLANG 00:08:00.059 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:00.059 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:00.059 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:00.059 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:00.059 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:00.059 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:00.059 #undef SPDK_CONFIG_HAVE_LZ4 00:08:00.059 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:00.059 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:00.059 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:00.059 #define SPDK_CONFIG_IDXD 1 00:08:00.059 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:00.059 #undef SPDK_CONFIG_IPSEC_MB 00:08:00.059 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:00.059 #define SPDK_CONFIG_ISAL 1 00:08:00.059 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:00.059 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:00.059 #define SPDK_CONFIG_LIBDIR 00:08:00.059 #undef SPDK_CONFIG_LTO 00:08:00.059 #define SPDK_CONFIG_MAX_LCORES 128 00:08:00.059 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:08:00.059 #define SPDK_CONFIG_NVME_CUSE 1 00:08:00.059 #undef SPDK_CONFIG_OCF 00:08:00.059 #define SPDK_CONFIG_OCF_PATH 00:08:00.059 #define SPDK_CONFIG_OPENSSL_PATH 00:08:00.059 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:00.059 #define SPDK_CONFIG_PGO_DIR 00:08:00.059 #undef SPDK_CONFIG_PGO_USE 00:08:00.059 #define SPDK_CONFIG_PREFIX /usr/local 00:08:00.059 #undef SPDK_CONFIG_RAID5F 00:08:00.059 #undef SPDK_CONFIG_RBD 00:08:00.059 #define SPDK_CONFIG_RDMA 1 00:08:00.059 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:00.059 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:00.059 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:00.059 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:00.059 #undef SPDK_CONFIG_SHARED 00:08:00.059 #undef SPDK_CONFIG_SMA 00:08:00.059 #define SPDK_CONFIG_TESTS 1 00:08:00.059 #undef SPDK_CONFIG_TSAN 00:08:00.059 #define SPDK_CONFIG_UBLK 1 00:08:00.059 #define SPDK_CONFIG_UBSAN 1 00:08:00.059 #undef SPDK_CONFIG_UNIT_TESTS 00:08:00.059 #undef SPDK_CONFIG_URING 00:08:00.059 #define SPDK_CONFIG_URING_PATH 00:08:00.059 #undef SPDK_CONFIG_URING_ZNS 00:08:00.059 #undef SPDK_CONFIG_USDT 00:08:00.059 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:00.059 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:00.059 #define SPDK_CONFIG_VFIO_USER 1 00:08:00.059 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:00.059 #define SPDK_CONFIG_VHOST 1 00:08:00.059 #define SPDK_CONFIG_VIRTIO 1 00:08:00.059 #undef SPDK_CONFIG_VTUNE 00:08:00.059 #define SPDK_CONFIG_VTUNE_DIR 00:08:00.059 #define SPDK_CONFIG_WERROR 1 00:08:00.059 #define SPDK_CONFIG_WPDK_DIR 00:08:00.059 #undef SPDK_CONFIG_XNVME 00:08:00.059 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- 
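The long SPDK_CONFIG_H dump just matched is the generated mirror of the CONFIG_* variables sourced from build_config.sh earlier in the trace: y becomes a "#define SPDK_CONFIG_... 1", n becomes "#undef", and path-like values pass through verbatim. applications.sh only needs one bit of it, probing the generated header for the DEBUG define. A sketch of that probe, reconstructed from the xtrace:

config=$rootdir/include/spdk/config.h
if [[ -e $config && $(<"$config") == *"#define SPDK_CONFIG_DEBUG"* ]]; then
    debug_build=1    # here: CONFIG_DEBUG=y, so the define is present
fi
# Debug-only app wrappers are then gated on SPDK_AUTOTEST_DEBUG_APPS, which
# this run leaves at 0, so the (( SPDK_AUTOTEST_DEBUG_APPS )) guard is false.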
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:00.059 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:00.060 12:50:03 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:00.060 
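The ": 1" / "export RUN_NIGHTLY" pairs running through this stretch are consistent with bash's default-then-export idiom: ":" with a ${VAR:=default} expansion assigns only when the variable is unset or empty, and xtrace prints the already-expanded value, which is why only ": 1" or ": 0" appears. A sketch of the pattern with values taken from this run (the := form itself is inferred, since xtrace shows only the expansion result):

: "${RUN_NIGHTLY:=1}";                  export RUN_NIGHTLY
: "${SPDK_RUN_FUNCTIONAL_TEST:=1}";     export SPDK_RUN_FUNCTIONAL_TEST
: "${SPDK_TEST_FUZZER:=1}";             export SPDK_TEST_FUZZER
: "${SPDK_TEST_FUZZER_SHORT:=1}";       export SPDK_TEST_FUZZER_SHORT
: "${SPDK_TEST_NVMF_TRANSPORT:=rdma}";  export SPDK_TEST_NVMF_TRANSPORT
: "${SPDK_RUN_UBSAN:=1}";               export SPDK_RUN_UBSAN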
12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:00.060 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : v22.11.4 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:08:00.061 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 148579 ]] 00:08:00.062 
12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 148579 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1696 -- # set_test_storage 2147483648 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.SjJ5WK 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.SjJ5WK/tests/nvmf /tmp/spdk.SjJ5WK 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:08:00.062 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=52729032704 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730582528 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=9001549824 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:00.323 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30861860864 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865289216 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=3428352 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340121600 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346118144 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5996544 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30864977920 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865293312 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=315392 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173044736 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173057024 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:08:00.324 * Looking for test storage... 
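[Annotation] For readers following the trace: the `set_test_storage` calls above parse `df -T` into the `mounts`/`fss`/`sizes`/`avails`/`uses` tables and then walk `storage_candidates` until a filesystem can hold the requested ~2 GiB. A minimal bash sketch of that pattern follows; it is condensed from the traced logic, and the candidate list and final export are simplified stand-ins for the exact SPDK helper rather than a copy of it.

#!/usr/bin/env bash
# Sketch of the storage-selection logic traced above (condensed from
# autotest_common.sh's set_test_storage; error handling simplified).
set_test_storage() {
    local requested_size=$1
    local mount target_dir target_space
    local -A mounts fss sizes avails uses
    local source fs size use avail _

    # Parse `df -T` into lookup tables keyed by mount point; df reports
    # 1K blocks, so scale everything to bytes (matching the byte counts
    # such as avails[/]=52729032704 seen in the trace).
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$((size * 1024))
        avails["$mount"]=$((avail * 1024))
        uses["$mount"]=$((use * 1024))
    done < <(df -T | grep -v Filesystem)

    # Walk the candidate dirs and take the first whose filesystem has
    # enough free space for the requested size.
    local storage_candidates=("$PWD" "/tmp")   # illustrative candidates
    for target_dir in "${storage_candidates[@]}"; do
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        target_space=${avails[$mount]:-0}
        if ((target_space >= requested_size)); then
            export SPDK_TEST_STORAGE=$target_dir
            printf '* Found test storage at %s\n' "$target_dir"
            return 0
        fi
    done
    return 1
}

set_test_storage $((2 * 1024 * 1024 * 1024))   # 2 GiB, as in the run above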
00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=52729032704 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=11216142336 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.324 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1698 -- # set -o errtrace 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1699 -- # shopt -s extdebug 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1700 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1702 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1703 -- # true 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1705 -- # xtrace_fd 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1711 -- # lcov --version 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:08:00.324 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.324 --rc genhtml_branch_coverage=1 00:08:00.324 --rc genhtml_function_coverage=1 00:08:00.324 --rc genhtml_legend=1 00:08:00.324 --rc geninfo_all_blocks=1 00:08:00.324 --rc geninfo_unexecuted_blocks=1 00:08:00.324 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.324 ' 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:08:00.324 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.324 --rc genhtml_branch_coverage=1 00:08:00.324 --rc genhtml_function_coverage=1 00:08:00.324 --rc genhtml_legend=1 00:08:00.324 --rc geninfo_all_blocks=1 00:08:00.324 --rc geninfo_unexecuted_blocks=1 00:08:00.324 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.324 ' 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:08:00.324 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.324 --rc genhtml_branch_coverage=1 00:08:00.324 --rc genhtml_function_coverage=1 00:08:00.324 --rc genhtml_legend=1 00:08:00.324 --rc geninfo_all_blocks=1 00:08:00.324 --rc geninfo_unexecuted_blocks=1 00:08:00.324 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.324 ' 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:08:00.324 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:00.324 --rc genhtml_branch_coverage=1 00:08:00.324 --rc genhtml_function_coverage=1 00:08:00.324 --rc genhtml_legend=1 00:08:00.324 --rc geninfo_all_blocks=1 00:08:00.324 --rc geninfo_unexecuted_blocks=1 00:08:00.324 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:00.324 ' 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:08:00.324 12:50:03 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:08:00.324 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:00.325 12:50:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:08:00.325 [2024-12-05 12:50:03.546528] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:00.325 [2024-12-05 12:50:03.546593] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid148644 ] 00:08:00.585 [2024-12-05 12:50:03.740811] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.585 [2024-12-05 12:50:03.753006] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.585 [2024-12-05 12:50:03.805347] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.585 [2024-12-05 12:50:03.821683] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:08:00.585 INFO: Running with entropic power schedule (0xFF, 100). 00:08:00.585 INFO: Seed: 3883447551 00:08:00.585 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:00.585 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:00.585 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:00.585 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.585 #2 INITED exec/s: 0 rss: 64Mb 00:08:00.585 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:00.585 This may also happen if the target rejected all inputs we tried so far 00:08:00.585 [2024-12-05 12:50:03.880494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:00.585 [2024-12-05 12:50:03.880523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.585 [2024-12-05 12:50:03.880585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:00.585 [2024-12-05 12:50:03.880603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.105 NEW_FUNC[1/716]: 0x4527a8 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:08:01.105 NEW_FUNC[2/716]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.105 #39 NEW cov: 12142 ft: 12114 corp: 2/137b lim: 320 exec/s: 0 rss: 72Mb L: 136/136 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:08:01.105 [2024-12-05 12:50:04.231524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:01.105 [2024-12-05 12:50:04.231579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.105 NEW_FUNC[1/1]: 0x1517488 in nvmf_tgroup_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:576 00:08:01.105 #44 NEW cov: 12267 ft: 13125 corp: 3/212b lim: 320 exec/s: 0 rss: 72Mb L: 75/136 MS: 5 CopyPart-ChangeByte-CrossOver-CrossOver-InsertRepeatedBytes- 00:08:01.105 [2024-12-05 12:50:04.281346] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.105 [2024-12-05 12:50:04.281372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.105 [2024-12-05 12:50:04.281436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:01.105 [2024-12-05 12:50:04.281454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.105 #45 NEW cov: 12273 ft: 13342 corp: 4/348b lim: 320 exec/s: 0 rss: 72Mb L: 136/136 MS: 1 CMP- DE: "T\234@\002\000\000\000\000"- 00:08:01.105 [2024-12-05 12:50:04.341563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:4232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.105 [2024-12-05 12:50:04.341590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.105 [2024-12-05 12:50:04.341652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:01.105 [2024-12-05 12:50:04.341671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.105 #47 NEW cov: 12358 ft: 13626 corp: 5/480b lim: 320 exec/s: 0 rss: 72Mb L: 132/136 MS: 2 CMP-CrossOver- DE: "\377\377\377\377\377\377\377\377"- 00:08:01.105 [2024-12-05 12:50:04.381638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.105 [2024-12-05 12:50:04.381665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.105 [2024-12-05 12:50:04.381728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:01.105 [2024-12-05 12:50:04.381747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.105 #48 NEW cov: 12358 ft: 13669 corp: 6/616b lim: 320 exec/s: 0 rss: 72Mb L: 136/136 MS: 1 ChangeByte- 00:08:01.366 [2024-12-05 12:50:04.421701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.366 [2024-12-05 12:50:04.421730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.366 [2024-12-05 12:50:04.421792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:01.366 [2024-12-05 12:50:04.421811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.366 #49 NEW cov: 12358 ft: 13697 corp: 7/752b lim: 320 exec/s: 0 rss: 72Mb L: 136/136 MS: 1 CopyPart- 00:08:01.366 [2024-12-05 12:50:04.481916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 
cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.366 [2024-12-05 12:50:04.481941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.366 [2024-12-05 12:50:04.482020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:9c540404 cdw11:00000240 00:08:01.366 [2024-12-05 12:50:04.482039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.366 #50 NEW cov: 12358 ft: 13757 corp: 8/888b lim: 320 exec/s: 0 rss: 72Mb L: 136/136 MS: 1 PersAutoDict- DE: "T\234@\002\000\000\000\000"- 00:08:01.366 [2024-12-05 12:50:04.521989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:4232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.366 [2024-12-05 12:50:04.522014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.366 [2024-12-05 12:50:04.522076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:01.366 [2024-12-05 12:50:04.522095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.366 #51 NEW cov: 12358 ft: 13784 corp: 9/1021b lim: 320 exec/s: 0 rss: 72Mb L: 133/136 MS: 1 InsertByte- 00:08:01.366 [2024-12-05 12:50:04.582099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:08:01.366 [2024-12-05 12:50:04.582124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.366 #52 NEW cov: 12358 ft: 13839 corp: 10/1147b lim: 320 exec/s: 0 rss: 72Mb L: 126/136 MS: 1 InsertRepeatedBytes- 00:08:01.366 [2024-12-05 12:50:04.622185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:01.366 [2024-12-05 12:50:04.622213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.366 #53 NEW cov: 12358 ft: 13964 corp: 11/1223b lim: 320 exec/s: 0 rss: 72Mb L: 76/136 MS: 1 InsertByte- 00:08:01.626 [2024-12-05 12:50:04.682424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.626 [2024-12-05 12:50:04.682449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.626 [2024-12-05 12:50:04.682510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:01.626 [2024-12-05 12:50:04.682529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.626 #54 NEW cov: 12358 ft: 13986 corp: 12/1359b lim: 320 exec/s: 0 rss: 72Mb L: 136/136 MS: 1 PersAutoDict- DE: "T\234@\002\000\000\000\000"- 00:08:01.626 [2024-12-05 12:50:04.722443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 
nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:08:01.626 [2024-12-05 12:50:04.722468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.626 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:01.626 #55 NEW cov: 12381 ft: 14048 corp: 13/1485b lim: 320 exec/s: 0 rss: 72Mb L: 126/136 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:01.626 [2024-12-05 12:50:04.782673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:aa cdw10:00000000 cdw11:00000000 00:08:01.626 [2024-12-05 12:50:04.782699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.626 #56 NEW cov: 12381 ft: 14130 corp: 14/1562b lim: 320 exec/s: 0 rss: 72Mb L: 77/136 MS: 1 InsertByte- 00:08:01.626 [2024-12-05 12:50:04.842822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:08:01.626 [2024-12-05 12:50:04.842851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.626 #57 NEW cov: 12381 ft: 14173 corp: 15/1689b lim: 320 exec/s: 0 rss: 73Mb L: 127/136 MS: 1 InsertByte- 00:08:01.626 [2024-12-05 12:50:04.883060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.626 [2024-12-05 12:50:04.883085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.626 [2024-12-05 12:50:04.883147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:01.626 [2024-12-05 12:50:04.883165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.626 #58 NEW cov: 12381 ft: 14225 corp: 16/1825b lim: 320 exec/s: 58 rss: 73Mb L: 136/136 MS: 1 ChangeBinInt- 00:08:01.626 [2024-12-05 12:50:04.923140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:4232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.626 [2024-12-05 12:50:04.923165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.626 [2024-12-05 12:50:04.923226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:01.626 [2024-12-05 12:50:04.923245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.886 #59 NEW cov: 12381 ft: 14240 corp: 17/1957b lim: 320 exec/s: 59 rss: 73Mb L: 132/136 MS: 1 ChangeBit- 00:08:01.887 [2024-12-05 12:50:04.963409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.887 [2024-12-05 12:50:04.963435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.887 [2024-12-05 12:50:04.963498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:fdfdfdfd cdw11:fdfdfdfd 00:08:01.887 [2024-12-05 12:50:04.963516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.887 [2024-12-05 12:50:04.963584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (fd) qid:0 cid:6 nsid:fdfdfdfd cdw10:fdfdfdfd cdw11:fdfdfdfd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.887 [2024-12-05 12:50:04.963600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.887 NEW_FUNC[1/1]: 0x198fb18 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:08:01.887 #60 NEW cov: 12395 ft: 14771 corp: 18/2195b lim: 320 exec/s: 60 rss: 73Mb L: 238/238 MS: 1 InsertRepeatedBytes- 00:08:01.887 [2024-12-05 12:50:05.023461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:232c2323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.887 [2024-12-05 12:50:05.023486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.887 [2024-12-05 12:50:05.023549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:01.887 [2024-12-05 12:50:05.023568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.887 #66 NEW cov: 12395 ft: 14784 corp: 19/2331b lim: 320 exec/s: 66 rss: 73Mb L: 136/238 MS: 1 ChangeByte- 00:08:01.887 [2024-12-05 12:50:05.063622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.887 [2024-12-05 12:50:05.063647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.887 [2024-12-05 12:50:05.063729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:01.887 [2024-12-05 12:50:05.063749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.887 [2024-12-05 12:50:05.063821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:6 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:01.887 [2024-12-05 12:50:05.063845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.887 #67 NEW cov: 12395 ft: 14839 corp: 20/2550b lim: 320 exec/s: 67 rss: 73Mb L: 219/238 MS: 1 CrossOver- 00:08:01.887 [2024-12-05 12:50:05.123615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:2000000 cdw10:00000000 cdw11:00000000 00:08:01.887 [2024-12-05 12:50:05.123640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.887 [2024-12-05 12:50:05.163725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:4 nsid:2000000 cdw10:00000000 cdw11:00000000 00:08:01.887 [2024-12-05 12:50:05.163750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.887 #69 NEW cov: 12395 ft: 14882 corp: 21/2627b lim: 320 exec/s: 69 rss: 73Mb L: 77/238 MS: 2 CMP-InsertByte- DE: "\002\000"- 00:08:02.147 [2024-12-05 12:50:05.204115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:02.147 [2024-12-05 12:50:05.204140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.147 [2024-12-05 12:50:05.204204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:fdfdfdfd cdw11:fdfdfdfd 00:08:02.147 [2024-12-05 12:50:05.204222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.147 [2024-12-05 12:50:05.204290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (fd) qid:0 cid:6 nsid:fdfdfdfd cdw10:fdfdfdfd cdw11:fdfdfdfd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.147 [2024-12-05 12:50:05.204308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.147 #75 NEW cov: 12395 ft: 14894 corp: 22/2865b lim: 320 exec/s: 75 rss: 73Mb L: 238/238 MS: 1 ChangeBinInt- 00:08:02.147 [2024-12-05 12:50:05.264009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:02.147 [2024-12-05 12:50:05.264034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.147 #76 NEW cov: 12395 ft: 14899 corp: 23/2939b lim: 320 exec/s: 76 rss: 73Mb L: 74/238 MS: 1 EraseBytes- 00:08:02.147 [2024-12-05 12:50:05.304120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:02.147 [2024-12-05 12:50:05.304144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.147 #77 NEW cov: 12395 ft: 14946 corp: 24/3014b lim: 320 exec/s: 77 rss: 73Mb L: 75/238 MS: 1 ChangeBinInt- 00:08:02.147 [2024-12-05 12:50:05.344272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:aa cdw10:00000000 cdw11:00000000 00:08:02.147 [2024-12-05 12:50:05.344298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.147 #78 NEW cov: 12395 ft: 14959 corp: 25/3099b lim: 320 exec/s: 78 rss: 73Mb L: 85/238 MS: 1 PersAutoDict- DE: "T\234@\002\000\000\000\000"- 00:08:02.147 [2024-12-05 12:50:05.404422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:02.147 [2024-12-05 12:50:05.404451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.147 #79 NEW cov: 12395 ft: 14996 corp: 26/3206b lim: 320 exec/s: 79 rss: 73Mb L: 107/238 MS: 1 
InsertRepeatedBytes- 00:08:02.407 [2024-12-05 12:50:05.464599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23230002 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:02.407 [2024-12-05 12:50:05.464626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.407 #80 NEW cov: 12395 ft: 15001 corp: 27/3313b lim: 320 exec/s: 80 rss: 73Mb L: 107/238 MS: 1 PersAutoDict- DE: "\002\000"- 00:08:02.407 [2024-12-05 12:50:05.524766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:02.407 [2024-12-05 12:50:05.524792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.407 #81 NEW cov: 12395 ft: 15013 corp: 28/3388b lim: 320 exec/s: 81 rss: 73Mb L: 75/238 MS: 1 ChangeByte- 00:08:02.407 [2024-12-05 12:50:05.584912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:02.407 [2024-12-05 12:50:05.584938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.407 #82 NEW cov: 12395 ft: 15024 corp: 29/3495b lim: 320 exec/s: 82 rss: 73Mb L: 107/238 MS: 1 ChangeBit- 00:08:02.407 [2024-12-05 12:50:05.625140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:02.407 [2024-12-05 12:50:05.625166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.407 [2024-12-05 12:50:05.625226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04040404 cdw11:04040404 00:08:02.407 [2024-12-05 12:50:05.625246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.407 #83 NEW cov: 12395 ft: 15093 corp: 30/3631b lim: 320 exec/s: 83 rss: 73Mb L: 136/238 MS: 1 ChangeBinInt- 00:08:02.407 [2024-12-05 12:50:05.665360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:02.407 [2024-12-05 12:50:05.665386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.407 [2024-12-05 12:50:05.665448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:fdfd2cfd cdw11:fdfdfdfd 00:08:02.407 [2024-12-05 12:50:05.665468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.407 [2024-12-05 12:50:05.665535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (fd) qid:0 cid:6 nsid:fdfdfdfd cdw10:fdfdfdfd cdw11:fdfdfdfd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.407 [2024-12-05 12:50:05.665552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.407 #84 NEW cov: 12395 ft: 15108 corp: 31/3869b lim: 320 exec/s: 84 rss: 73Mb 
L: 238/238 MS: 1 ChangeByte- 00:08:02.666 [2024-12-05 12:50:05.725305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:2000000 cdw10:00000000 cdw11:00000000 00:08:02.666 [2024-12-05 12:50:05.725332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.666 #85 NEW cov: 12395 ft: 15116 corp: 32/3946b lim: 320 exec/s: 85 rss: 74Mb L: 77/238 MS: 1 ChangeBinInt- 00:08:02.666 [2024-12-05 12:50:05.785460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:02.666 [2024-12-05 12:50:05.785485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.666 #86 NEW cov: 12395 ft: 15122 corp: 33/4021b lim: 320 exec/s: 86 rss: 74Mb L: 75/238 MS: 1 ChangeBit- 00:08:02.666 [2024-12-05 12:50:05.845729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (23) qid:0 cid:4 nsid:23232323 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040404 00:08:02.666 [2024-12-05 12:50:05.845754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.666 [2024-12-05 12:50:05.845815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:04043b04 cdw11:04040404 00:08:02.666 [2024-12-05 12:50:05.845839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.666 #87 NEW cov: 12395 ft: 15127 corp: 34/4158b lim: 320 exec/s: 43 rss: 74Mb L: 137/238 MS: 1 InsertByte- 00:08:02.666 #87 DONE cov: 12395 ft: 15127 corp: 34/4158b lim: 320 exec/s: 43 rss: 74Mb 00:08:02.666 ###### Recommended dictionary. ###### 00:08:02.666 "T\234@\002\000\000\000\000" # Uses: 3 00:08:02.666 "\377\377\377\377\377\377\377\377" # Uses: 2 00:08:02.666 "\002\000" # Uses: 1 00:08:02.666 ###### End of recommended dictionary. 
###### 00:08:02.666 Done 87 runs in 2 second(s) 00:08:02.925 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:08:02.925 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:02.925 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.925 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:02.925 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:08:02.925 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:02.925 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:02.925 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:02.925 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:08:02.925 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:02.926 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:02.926 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:08:02.926 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:08:02.926 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:02.926 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:08:02.926 12:50:05 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:02.926 12:50:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:02.926 12:50:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:02.926 12:50:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:08:02.926 [2024-12-05 12:50:06.032046] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:02.926 [2024-12-05 12:50:06.032116] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid149179 ] 00:08:03.186 [2024-12-05 12:50:06.309299] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.186 [2024-12-05 12:50:06.332118] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.186 [2024-12-05 12:50:06.384447] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:03.186 [2024-12-05 12:50:06.400778] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:08:03.186 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:03.186 INFO: Seed: 2167478333 00:08:03.186 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:03.186 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:03.186 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:03.186 INFO: A corpus is not provided, starting from an empty corpus 00:08:03.186 #2 INITED exec/s: 0 rss: 65Mb 00:08:03.186 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:03.186 This may also happen if the target rejected all inputs we tried so far 00:08:03.186 [2024-12-05 12:50:06.470664] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.186 [2024-12-05 12:50:06.470927] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.186 [2024-12-05 12:50:06.471168] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.186 [2024-12-05 12:50:06.471418] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.186 [2024-12-05 12:50:06.471877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.186 [2024-12-05 12:50:06.471920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.186 [2024-12-05 12:50:06.471995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.186 [2024-12-05 12:50:06.472011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.186 [2024-12-05 12:50:06.472082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.186 [2024-12-05 12:50:06.472098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.186 [2024-12-05 12:50:06.472166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.186 [2024-12-05 12:50:06.472181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.706 NEW_FUNC[1/717]: 0x4530a8 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:08:03.706 NEW_FUNC[2/717]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.706 #5 NEW cov: 12167 ft: 12145 corp: 2/26b lim: 30 exec/s: 0 rss: 73Mb L: 25/25 MS: 3 ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:03.706 [2024-12-05 12:50:06.810757] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000dfdf 00:08:03.706 [2024-12-05 12:50:06.811167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:dfdf83df cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.706 [2024-12-05 12:50:06.811209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.706 #6 NEW cov: 12304 ft: 13614 corp: 3/34b lim: 30 exec/s: 0 rss: 73Mb L: 8/25 MS: 1 InsertRepeatedBytes- 00:08:03.706 [2024-12-05 12:50:06.861064] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.706 [2024-12-05 12:50:06.861221] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.706 [2024-12-05 12:50:06.861356] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.706 [2024-12-05 12:50:06.861495] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.706 [2024-12-05 12:50:06.861808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.706 [2024-12-05 12:50:06.861841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.707 [2024-12-05 12:50:06.861959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:06.861977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.707 [2024-12-05 12:50:06.862092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d53f81d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:06.862112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.707 [2024-12-05 12:50:06.862237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:06.862255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.707 #7 NEW cov: 12310 ft: 13805 corp: 4/60b lim: 30 exec/s: 0 rss: 73Mb L: 26/26 MS: 1 InsertByte- 00:08:03.707 [2024-12-05 12:50:06.920998] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000dfdf 00:08:03.707 [2024-12-05 12:50:06.921316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0adf83df cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:06.921344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.707 #9 NEW cov: 12395 ft: 14104 corp: 5/68b lim: 30 exec/s: 0 rss: 73Mb L: 8/26 MS: 2 ShuffleBytes-CrossOver- 00:08:03.707 [2024-12-05 12:50:06.961332] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.707 [2024-12-05 12:50:06.961502] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.707 [2024-12-05 12:50:06.961651] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.707 [2024-12-05 12:50:06.961791] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.707 [2024-12-05 12:50:06.962146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:06.962178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.707 [2024-12-05 12:50:06.962300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:06.962318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.707 [2024-12-05 12:50:06.962436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:06.962454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.707 [2024-12-05 12:50:06.962582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:06.962600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.707 #10 NEW cov: 12395 ft: 14232 corp: 6/93b lim: 30 exec/s: 0 rss: 73Mb L: 25/26 MS: 1 ShuffleBytes- 00:08:03.707 [2024-12-05 12:50:07.001403] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.707 [2024-12-05 12:50:07.001571] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.707 [2024-12-05 12:50:07.001712] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.707 [2024-12-05 12:50:07.001860] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.707 [2024-12-05 12:50:07.002185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3fd581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:07.002215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.707 [2024-12-05 12:50:07.002335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:07.002352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.707 [2024-12-05 12:50:07.002471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d53f81d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:07.002489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.707 [2024-12-05 12:50:07.002613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.707 [2024-12-05 12:50:07.002634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.967 #16 NEW cov: 12395 ft: 14320 corp: 7/119b lim: 30 exec/s: 0 rss: 73Mb L: 26/26 MS: 1 ChangeByte- 00:08:03.967 [2024-12-05 12:50:07.071701] 
ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.967 [2024-12-05 12:50:07.071875] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.967 [2024-12-05 12:50:07.072030] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.967 [2024-12-05 12:50:07.072176] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.967 [2024-12-05 12:50:07.072484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3bd581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.967 [2024-12-05 12:50:07.072513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.967 [2024-12-05 12:50:07.072636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.967 [2024-12-05 12:50:07.072655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.967 [2024-12-05 12:50:07.072775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d53f81d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.967 [2024-12-05 12:50:07.072794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.967 [2024-12-05 12:50:07.072913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.967 [2024-12-05 12:50:07.072933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.967 #17 NEW cov: 12395 ft: 14377 corp: 8/145b lim: 30 exec/s: 0 rss: 73Mb L: 26/26 MS: 1 ChangeByte- 00:08:03.967 [2024-12-05 12:50:07.141717] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.967 [2024-12-05 12:50:07.141884] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.967 [2024-12-05 12:50:07.142194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a3b81d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.967 [2024-12-05 12:50:07.142224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.967 [2024-12-05 12:50:07.142346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.967 [2024-12-05 12:50:07.142364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.967 #19 NEW cov: 12395 ft: 14685 corp: 9/162b lim: 30 exec/s: 0 rss: 73Mb L: 17/26 MS: 2 ShuffleBytes-CrossOver- 00:08:03.967 [2024-12-05 12:50:07.181810] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (28804) > buf size (4096) 00:08:03.967 [2024-12-05 12:50:07.182149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1c200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.967 [2024-12-05 12:50:07.182178] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.967 #20 NEW cov: 12418 ft: 14746 corp: 10/170b lim: 30 exec/s: 0 rss: 73Mb L: 8/26 MS: 1 ChangeBinInt- 00:08:03.967 [2024-12-05 12:50:07.242073] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.967 [2024-12-05 12:50:07.242224] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:03.967 [2024-12-05 12:50:07.242359] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000d5d5 00:08:03.967 [2024-12-05 12:50:07.242710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0d0a813b cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.967 [2024-12-05 12:50:07.242740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.968 [2024-12-05 12:50:07.242873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.968 [2024-12-05 12:50:07.242891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.968 [2024-12-05 12:50:07.243013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d583d5 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.968 [2024-12-05 12:50:07.243034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.968 #22 NEW cov: 12418 ft: 14986 corp: 11/188b lim: 30 exec/s: 0 rss: 73Mb L: 18/26 MS: 2 InsertByte-CrossOver- 00:08:04.228 [2024-12-05 12:50:07.282125] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.228 [2024-12-05 12:50:07.282281] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d53f 00:08:04.228 [2024-12-05 12:50:07.282639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0d0a813b cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.228 [2024-12-05 12:50:07.282669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.228 [2024-12-05 12:50:07.282790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.228 [2024-12-05 12:50:07.282809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.228 #23 NEW cov: 12418 ft: 15132 corp: 12/202b lim: 30 exec/s: 0 rss: 73Mb L: 14/26 MS: 1 EraseBytes- 00:08:04.228 [2024-12-05 12:50:07.352525] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.228 [2024-12-05 12:50:07.352694] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.229 [2024-12-05 12:50:07.352843] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.229 [2024-12-05 12:50:07.352981] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.229 [2024-12-05 12:50:07.353327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG 
PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.353358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.229 [2024-12-05 12:50:07.353473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.353490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.229 [2024-12-05 12:50:07.353611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.353629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.229 [2024-12-05 12:50:07.353747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.353766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.229 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:04.229 #24 NEW cov: 12441 ft: 15167 corp: 13/230b lim: 30 exec/s: 0 rss: 73Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:04.229 [2024-12-05 12:50:07.392625] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.229 [2024-12-05 12:50:07.392779] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.229 [2024-12-05 12:50:07.392921] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.229 [2024-12-05 12:50:07.393066] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.229 [2024-12-05 12:50:07.393403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3fd581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.393432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.229 [2024-12-05 12:50:07.393551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.393570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.229 [2024-12-05 12:50:07.393693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d53f813d cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.393711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.229 [2024-12-05 12:50:07.393837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.393855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.229 #25 NEW cov: 12441 ft: 15192 corp: 14/257b lim: 30 exec/s: 0 rss: 73Mb L: 27/28 MS: 1 InsertByte- 00:08:04.229 [2024-12-05 12:50:07.432738] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.229 [2024-12-05 12:50:07.432905] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.229 [2024-12-05 12:50:07.433057] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.229 [2024-12-05 12:50:07.433190] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d52b 00:08:04.229 [2024-12-05 12:50:07.433547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3bd581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.433576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.229 [2024-12-05 12:50:07.433695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.433713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.229 [2024-12-05 12:50:07.433840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d53f81d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.433861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.229 [2024-12-05 12:50:07.433981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.434001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.229 #26 NEW cov: 12441 ft: 15220 corp: 15/283b lim: 30 exec/s: 26 rss: 74Mb L: 26/28 MS: 1 ChangeByte- 00:08:04.229 [2024-12-05 12:50:07.502647] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (28804) > buf size (4096) 00:08:04.229 [2024-12-05 12:50:07.502983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1c200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.229 [2024-12-05 12:50:07.503012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.490 #27 NEW cov: 12441 ft: 15273 corp: 16/291b lim: 30 exec/s: 27 rss: 74Mb L: 8/28 MS: 1 ChangeBit- 00:08:04.490 [2024-12-05 12:50:07.562991] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.563141] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.563282] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.563422] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.563740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:04.490 [2024-12-05 12:50:07.563770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.490 [2024-12-05 12:50:07.563905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.490 [2024-12-05 12:50:07.563925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.490 [2024-12-05 12:50:07.564043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.490 [2024-12-05 12:50:07.564061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.490 [2024-12-05 12:50:07.564193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d58124 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.490 [2024-12-05 12:50:07.564214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.490 #28 NEW cov: 12441 ft: 15285 corp: 17/317b lim: 30 exec/s: 28 rss: 74Mb L: 26/28 MS: 1 InsertByte- 00:08:04.490 [2024-12-05 12:50:07.623270] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.623447] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.623586] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.623723] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.624073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.490 [2024-12-05 12:50:07.624101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.490 [2024-12-05 12:50:07.624217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.490 [2024-12-05 12:50:07.624235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.490 [2024-12-05 12:50:07.624356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2ed581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.490 [2024-12-05 12:50:07.624376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.490 [2024-12-05 12:50:07.624498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.490 [2024-12-05 12:50:07.624517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.490 #29 NEW cov: 12441 ft: 15306 corp: 18/345b lim: 30 exec/s: 29 rss: 74Mb L: 28/28 MS: 1 ChangeByte- 00:08:04.490 [2024-12-05 12:50:07.683281] 
ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.683460] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.683791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a3b814c cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.490 [2024-12-05 12:50:07.683820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.490 [2024-12-05 12:50:07.683938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.490 [2024-12-05 12:50:07.683957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.490 #30 NEW cov: 12441 ft: 15328 corp: 19/362b lim: 30 exec/s: 30 rss: 74Mb L: 17/28 MS: 1 ChangeByte- 00:08:04.490 [2024-12-05 12:50:07.753548] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.753708] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.490 [2024-12-05 12:50:07.753849] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000d5d5 00:08:04.490 [2024-12-05 12:50:07.754195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a3b81d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.490 [2024-12-05 12:50:07.754224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.490 [2024-12-05 12:50:07.754342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.491 [2024-12-05 12:50:07.754358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.491 [2024-12-05 12:50:07.754486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d5834b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.491 [2024-12-05 12:50:07.754505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.491 #31 NEW cov: 12441 ft: 15341 corp: 20/380b lim: 30 exec/s: 31 rss: 74Mb L: 18/28 MS: 1 InsertByte- 00:08:04.491 [2024-12-05 12:50:07.793714] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.491 [2024-12-05 12:50:07.793880] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.491 [2024-12-05 12:50:07.794015] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d9 00:08:04.491 [2024-12-05 12:50:07.794152] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.491 [2024-12-05 12:50:07.794474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.491 [2024-12-05 12:50:07.794505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.491 
[2024-12-05 12:50:07.794628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.491 [2024-12-05 12:50:07.794647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.491 [2024-12-05 12:50:07.794775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.491 [2024-12-05 12:50:07.794794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.491 [2024-12-05 12:50:07.794921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.491 [2024-12-05 12:50:07.794940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.752 #32 NEW cov: 12441 ft: 15373 corp: 21/409b lim: 30 exec/s: 32 rss: 74Mb L: 29/29 MS: 1 CopyPart- 00:08:04.752 [2024-12-05 12:50:07.843776] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.752 [2024-12-05 12:50:07.843939] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.752 [2024-12-05 12:50:07.844081] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d53f 00:08:04.752 [2024-12-05 12:50:07.844409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a3b814c cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:07.844438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.752 [2024-12-05 12:50:07.844552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:07.844571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.752 [2024-12-05 12:50:07.844688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:07.844706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.752 #33 NEW cov: 12441 ft: 15391 corp: 22/429b lim: 30 exec/s: 33 rss: 74Mb L: 20/29 MS: 1 CrossOver- 00:08:04.752 [2024-12-05 12:50:07.904076] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.752 [2024-12-05 12:50:07.904235] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.752 [2024-12-05 12:50:07.904374] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.752 [2024-12-05 12:50:07.904520] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (481112) > buf size (4096) 00:08:04.752 [2024-12-05 12:50:07.904665] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.752 [2024-12-05 12:50:07.905017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG 
PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:07.905046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.752 [2024-12-05 12:50:07.905166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:07.905185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.752 [2024-12-05 12:50:07.905302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:07.905322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.752 [2024-12-05 12:50:07.905439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:07.905458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.752 [2024-12-05 12:50:07.905586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:07.905603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.752 #34 NEW cov: 12441 ft: 15433 corp: 23/459b lim: 30 exec/s: 34 rss: 74Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:04.752 [2024-12-05 12:50:07.944038] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.752 [2024-12-05 12:50:07.944203] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.752 [2024-12-05 12:50:07.944344] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000d531 00:08:04.752 [2024-12-05 12:50:07.944667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a3b81d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:07.944696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.752 [2024-12-05 12:50:07.944816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:07.944839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.752 [2024-12-05 12:50:07.944960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d5834b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:07.944978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.752 #35 NEW cov: 12441 ft: 15465 corp: 24/477b lim: 30 exec/s: 35 rss: 74Mb L: 18/30 MS: 1 ChangeByte- 00:08:04.752 [2024-12-05 12:50:08.004224] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: 
Invalid log page offset 0x10000d5d5 00:08:04.752 [2024-12-05 12:50:08.004392] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:04.752 [2024-12-05 12:50:08.004532] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d53f 00:08:04.752 [2024-12-05 12:50:08.004873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a3b814c cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:08.004906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.752 [2024-12-05 12:50:08.005027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:08.005045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.752 [2024-12-05 12:50:08.005164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.752 [2024-12-05 12:50:08.005182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.752 #36 NEW cov: 12441 ft: 15475 corp: 25/497b lim: 30 exec/s: 36 rss: 74Mb L: 20/30 MS: 1 CrossOver- 00:08:05.013 [2024-12-05 12:50:08.074570] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.074745] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.074892] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.075032] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.075382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.075410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.075526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.075542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.075661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.075677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.075791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d58124 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.075809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.013 #37 NEW cov: 12441 ft: 15486 corp: 26/524b lim: 
30 exec/s: 37 rss: 74Mb L: 27/30 MS: 1 InsertByte- 00:08:05.013 [2024-12-05 12:50:08.134643] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.134813] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.134995] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.135140] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.135463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.135491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.135607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.135624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.135735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.135754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.135870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d53d81d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.135888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.013 #38 NEW cov: 12441 ft: 15516 corp: 27/553b lim: 30 exec/s: 38 rss: 74Mb L: 29/30 MS: 1 InsertByte- 00:08:05.013 [2024-12-05 12:50:08.174778] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.174945] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.175094] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d595 00:08:05.013 [2024-12-05 12:50:08.175240] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d52b 00:08:05.013 [2024-12-05 12:50:08.175562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:3bd581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.175591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.175706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.175723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.175841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d53f81d5 
cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.175856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.175982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.175999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.013 #39 NEW cov: 12441 ft: 15520 corp: 28/579b lim: 30 exec/s: 39 rss: 75Mb L: 26/30 MS: 1 ChangeBit- 00:08:05.013 [2024-12-05 12:50:08.234740] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.235090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a3b81d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.235118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.013 #40 NEW cov: 12441 ft: 15525 corp: 29/590b lim: 30 exec/s: 40 rss: 75Mb L: 11/30 MS: 1 EraseBytes- 00:08:05.013 [2024-12-05 12:50:08.275079] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.275243] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.275382] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.275527] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.013 [2024-12-05 12:50:08.275860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.275887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.276019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.276039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.276153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2ed581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.276172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.013 [2024-12-05 12:50:08.276284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.013 [2024-12-05 12:50:08.276301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.013 #41 NEW cov: 12441 ft: 15534 corp: 30/618b lim: 30 exec/s: 41 rss: 75Mb L: 28/30 MS: 1 ChangeByte- 00:08:05.273 [2024-12-05 12:50:08.345249] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.273 [2024-12-05 
12:50:08.345412] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.273 [2024-12-05 12:50:08.345555] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.273 [2024-12-05 12:50:08.345706] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.273 [2024-12-05 12:50:08.346038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.273 [2024-12-05 12:50:08.346066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.273 [2024-12-05 12:50:08.346185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.273 [2024-12-05 12:50:08.346205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.273 [2024-12-05 12:50:08.346320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.273 [2024-12-05 12:50:08.346338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.273 [2024-12-05 12:50:08.346456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.273 [2024-12-05 12:50:08.346472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.273 [2024-12-05 12:50:08.385402] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.273 [2024-12-05 12:50:08.385574] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.273 [2024-12-05 12:50:08.385714] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d5d5 00:08:05.274 [2024-12-05 12:50:08.385868] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:05.274 [2024-12-05 12:50:08.386178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:d9d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.274 [2024-12-05 12:50:08.386207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.274 [2024-12-05 12:50:08.386334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.274 [2024-12-05 12:50:08.386354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.274 [2024-12-05 12:50:08.386476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d5d581d5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.274 [2024-12-05 12:50:08.386495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.274 [2024-12-05 12:50:08.386616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET 
LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d5d583d5 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.274 [2024-12-05 12:50:08.386633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.274 #43 NEW cov: 12441 ft: 15564 corp: 31/643b lim: 30 exec/s: 43 rss: 75Mb L: 25/30 MS: 2 CopyPart-CMP- DE: "\377\377\377\365"- 00:08:05.274 [2024-12-05 12:50:08.425303] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (28804) > buf size (4096) 00:08:05.274 [2024-12-05 12:50:08.425666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1c200020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.274 [2024-12-05 12:50:08.425695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.274 #44 NEW cov: 12441 ft: 15573 corp: 32/651b lim: 30 exec/s: 22 rss: 75Mb L: 8/30 MS: 1 ChangeBit- 00:08:05.274 #44 DONE cov: 12441 ft: 15573 corp: 32/651b lim: 30 exec/s: 22 rss: 75Mb 00:08:05.274 ###### Recommended dictionary. ###### 00:08:05.274 "\377\377\377\365" # Uses: 0 00:08:05.274 ###### End of recommended dictionary. ###### 00:08:05.274 Done 44 runs in 2 second(s) 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:05.274 12:50:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:08:05.534 [2024-12-05 12:50:08.611358] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:05.534 [2024-12-05 12:50:08.611426] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid149487 ] 00:08:05.534 [2024-12-05 12:50:08.817090] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.534 [2024-12-05 12:50:08.829806] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.793 [2024-12-05 12:50:08.882451] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.793 [2024-12-05 12:50:08.898768] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:08:05.793 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.793 INFO: Seed: 372528933 00:08:05.793 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:05.793 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:05.793 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:05.793 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.793 #2 INITED exec/s: 0 rss: 64Mb 00:08:05.793 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:05.793 This may also happen if the target rejected all inputs we tried so far 00:08:05.793 [2024-12-05 12:50:08.974957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:2f00ff21 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.793 [2024-12-05 12:50:08.974995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.053 NEW_FUNC[1/716]: 0x455b58 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:08:06.053 NEW_FUNC[2/716]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.053 #11 NEW cov: 12123 ft: 12122 corp: 2/8b lim: 35 exec/s: 0 rss: 72Mb L: 7/7 MS: 4 ShuffleBytes-CrossOver-CMP-InsertByte- DE: "\377\377\377!"- 00:08:06.053 [2024-12-05 12:50:09.306031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:2100ff21 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.053 [2024-12-05 12:50:09.306084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.053 #12 NEW cov: 12259 ft: 12834 corp: 3/18b lim: 35 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 CopyPart- 00:08:06.312 [2024-12-05 12:50:09.376128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d9d9000e cdw11:d900d9d9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.312 [2024-12-05 12:50:09.376155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.312 #14 NEW cov: 12265 ft: 13040 corp: 4/30b lim: 35 exec/s: 0 rss: 72Mb L: 12/12 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:06.312 [2024-12-05 12:50:09.426157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.312 [2024-12-05 12:50:09.426186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.312 #20 NEW cov: 12350 ft: 13274 corp: 5/39b lim: 35 exec/s: 0 rss: 72Mb L: 9/12 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:06.312 [2024-12-05 12:50:09.476414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.312 [2024-12-05 12:50:09.476443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.312 #21 NEW cov: 12350 ft: 13355 corp: 6/46b lim: 35 exec/s: 0 rss: 72Mb L: 7/12 MS: 1 CopyPart- 00:08:06.312 [2024-12-05 12:50:09.526514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:f8ff000a cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.312 [2024-12-05 12:50:09.526544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.312 #22 NEW cov: 12350 ft: 13451 corp: 7/55b lim: 35 exec/s: 0 rss: 72Mb L: 9/12 MS: 1 ChangeBinInt- 00:08:06.312 [2024-12-05 12:50:09.596750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ff000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:06.312 [2024-12-05 12:50:09.596777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.312 #23 NEW cov: 12350 ft: 13499 corp: 8/64b lim: 35 exec/s: 0 rss: 72Mb L: 9/12 MS: 1 PersAutoDict- DE: "\377\377\377!"- 00:08:06.571 [2024-12-05 12:50:09.646911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d90a000e cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.571 [2024-12-05 12:50:09.646939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.571 #24 NEW cov: 12350 ft: 13587 corp: 9/76b lim: 35 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 CrossOver- 00:08:06.571 [2024-12-05 12:50:09.717642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4747008f cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.571 [2024-12-05 12:50:09.717668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.571 [2024-12-05 12:50:09.717804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:47470047 cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.571 [2024-12-05 12:50:09.717821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.571 [2024-12-05 12:50:09.717957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:47470047 cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.571 [2024-12-05 12:50:09.717976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.571 #26 NEW cov: 12350 ft: 13973 corp: 10/97b lim: 35 exec/s: 0 rss: 72Mb L: 21/21 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:06.571 [2024-12-05 12:50:09.767383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff27000a cdw11:2f00ff21 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.571 [2024-12-05 12:50:09.767412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.571 #27 NEW cov: 12350 ft: 14054 corp: 11/104b lim: 35 exec/s: 0 rss: 72Mb L: 7/21 MS: 1 ChangeByte- 00:08:06.571 [2024-12-05 12:50:09.817391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d90a000e cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.571 [2024-12-05 12:50:09.817416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.571 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:06.571 #33 NEW cov: 12373 ft: 14084 corp: 12/116b lim: 35 exec/s: 0 rss: 72Mb L: 12/21 MS: 1 ChangeByte- 00:08:06.831 [2024-12-05 12:50:09.887972] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:06.831 [2024-12-05 12:50:09.888532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4747008f cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.831 [2024-12-05 12:50:09.888562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.831 [2024-12-05 12:50:09.888700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:47470047 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.831 [2024-12-05 12:50:09.888721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.831 [2024-12-05 12:50:09.888863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:47000047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.831 [2024-12-05 12:50:09.888889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.831 [2024-12-05 12:50:09.889024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:47470047 cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.831 [2024-12-05 12:50:09.889041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.831 #34 NEW cov: 12384 ft: 14696 corp: 13/145b lim: 35 exec/s: 0 rss: 73Mb L: 29/29 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:06.831 [2024-12-05 12:50:09.957843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:d90a000e cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.831 [2024-12-05 12:50:09.957873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.831 #35 NEW cov: 12384 ft: 14739 corp: 14/158b lim: 35 exec/s: 35 rss: 73Mb L: 13/29 MS: 1 InsertByte- 00:08:06.831 [2024-12-05 12:50:10.028400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4747008f cdw11:0a004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.831 [2024-12-05 12:50:10.028432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.831 [2024-12-05 12:50:10.028580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff2100ff cdw11:47002147 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.831 [2024-12-05 12:50:10.028598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.831 #36 NEW cov: 12384 ft: 14955 corp: 15/178b lim: 35 exec/s: 36 rss: 73Mb L: 20/29 MS: 1 CrossOver- 00:08:06.831 [2024-12-05 12:50:10.078538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.831 [2024-12-05 12:50:10.078570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.831 [2024-12-05 12:50:10.078706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.831 [2024-12-05 12:50:10.078726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.831 #37 NEW cov: 12384 ft: 14961 corp: 16/194b lim: 35 exec/s: 37 rss: 73Mb L: 16/29 MS: 1 CopyPart- 00:08:07.091 [2024-12-05 12:50:10.148733] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.091 [2024-12-05 12:50:10.148762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.091 [2024-12-05 12:50:10.148900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.091 [2024-12-05 12:50:10.148920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.091 #38 NEW cov: 12384 ft: 14970 corp: 17/212b lim: 35 exec/s: 38 rss: 73Mb L: 18/29 MS: 1 InsertRepeatedBytes- 00:08:07.091 [2024-12-05 12:50:10.198846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:f700ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.091 [2024-12-05 12:50:10.198874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.091 [2024-12-05 12:50:10.199004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.091 [2024-12-05 12:50:10.199024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.091 #39 NEW cov: 12384 ft: 14992 corp: 18/230b lim: 35 exec/s: 39 rss: 73Mb L: 18/29 MS: 1 ChangeBit- 00:08:07.091 [2024-12-05 12:50:10.269378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:473e008f cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.091 [2024-12-05 12:50:10.269407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.091 [2024-12-05 12:50:10.269541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:47470047 cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.091 [2024-12-05 12:50:10.269559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.091 [2024-12-05 12:50:10.269687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:47470047 cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.092 [2024-12-05 12:50:10.269705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.092 #40 NEW cov: 12384 ft: 15016 corp: 19/252b lim: 35 exec/s: 40 rss: 73Mb L: 22/29 MS: 1 InsertByte- 00:08:07.092 [2024-12-05 12:50:10.318669] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.092 [2024-12-05 12:50:10.319054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.092 [2024-12-05 12:50:10.319089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.092 #41 NEW cov: 12384 ft: 15049 corp: 20/262b lim: 35 exec/s: 41 rss: 73Mb L: 10/29 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:07.092 [2024-12-05 
12:50:10.389500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4747008f cdw11:0a004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.092 [2024-12-05 12:50:10.389529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.092 [2024-12-05 12:50:10.389665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff2100ff cdw11:00002104 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.092 [2024-12-05 12:50:10.389681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.351 #42 NEW cov: 12384 ft: 15077 corp: 21/282b lim: 35 exec/s: 42 rss: 73Mb L: 20/29 MS: 1 CMP- DE: "\004\000"- 00:08:07.351 [2024-12-05 12:50:10.459676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.351 [2024-12-05 12:50:10.459704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.351 [2024-12-05 12:50:10.459840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.351 [2024-12-05 12:50:10.459860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.351 #43 NEW cov: 12384 ft: 15104 corp: 22/302b lim: 35 exec/s: 43 rss: 73Mb L: 20/29 MS: 1 InsertRepeatedBytes- 00:08:07.351 [2024-12-05 12:50:10.529664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.351 [2024-12-05 12:50:10.529693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.351 #44 NEW cov: 12384 ft: 15109 corp: 23/311b lim: 35 exec/s: 44 rss: 73Mb L: 9/29 MS: 1 ChangeBinInt- 00:08:07.351 [2024-12-05 12:50:10.579840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.351 [2024-12-05 12:50:10.579874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.351 #45 NEW cov: 12384 ft: 15141 corp: 24/324b lim: 35 exec/s: 45 rss: 73Mb L: 13/29 MS: 1 PersAutoDict- DE: "\377\377\377!"- 00:08:07.351 [2024-12-05 12:50:10.630295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.351 [2024-12-05 12:50:10.630324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.351 [2024-12-05 12:50:10.630454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.351 [2024-12-05 12:50:10.630472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.610 #46 NEW cov: 12384 ft: 15148 corp: 25/344b lim: 35 exec/s: 46 rss: 73Mb L: 20/29 MS: 1 ChangeBit- 00:08:07.610 [2024-12-05 12:50:10.700198] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:2f000721 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.610 [2024-12-05 12:50:10.700228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.610 #47 NEW cov: 12384 ft: 15154 corp: 26/351b lim: 35 exec/s: 47 rss: 73Mb L: 7/29 MS: 1 ChangeBinInt- 00:08:07.610 [2024-12-05 12:50:10.750623] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.610 [2024-12-05 12:50:10.751164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4747008f cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.610 [2024-12-05 12:50:10.751193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.610 [2024-12-05 12:50:10.751326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:47470047 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.610 [2024-12-05 12:50:10.751343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.610 [2024-12-05 12:50:10.751477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:47000047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.610 [2024-12-05 12:50:10.751500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.610 [2024-12-05 12:50:10.751629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:47470047 cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.610 [2024-12-05 12:50:10.751649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.610 #48 NEW cov: 12384 ft: 15186 corp: 27/381b lim: 35 exec/s: 48 rss: 73Mb L: 30/30 MS: 1 InsertByte- 00:08:07.610 [2024-12-05 12:50:10.820793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00ff000a cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.610 [2024-12-05 12:50:10.820821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.610 [2024-12-05 12:50:10.820964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.610 [2024-12-05 12:50:10.820982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.610 #49 NEW cov: 12384 ft: 15204 corp: 28/397b lim: 35 exec/s: 49 rss: 73Mb L: 16/30 MS: 1 ChangeByte- 00:08:07.610 [2024-12-05 12:50:10.870600] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:07.610 [2024-12-05 12:50:10.871001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000008f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.610 [2024-12-05 12:50:10.871029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.610 [2024-12-05 12:50:10.871161] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00210000 cdw11:00002104 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.610 [2024-12-05 12:50:10.871187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.610 #50 NEW cov: 12384 ft: 15220 corp: 29/417b lim: 35 exec/s: 50 rss: 74Mb L: 20/30 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:07.870 [2024-12-05 12:50:10.941367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4723008f cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.870 [2024-12-05 12:50:10.941393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.870 [2024-12-05 12:50:10.941524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:47470047 cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.870 [2024-12-05 12:50:10.941542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.870 [2024-12-05 12:50:10.941680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:47470047 cdw11:47004747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.870 [2024-12-05 12:50:10.941700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.870 #51 NEW cov: 12384 ft: 15234 corp: 30/439b lim: 35 exec/s: 25 rss: 74Mb L: 22/30 MS: 1 InsertByte- 00:08:07.870 #51 DONE cov: 12384 ft: 15234 corp: 30/439b lim: 35 exec/s: 25 rss: 74Mb 00:08:07.870 ###### Recommended dictionary. ###### 00:08:07.870 "\377\377\377!" # Uses: 3 00:08:07.870 "\000\000\000\000\000\000\000\000" # Uses: 2 00:08:07.870 "\004\000" # Uses: 0 00:08:07.870 ###### End of recommended dictionary. 
###### 00:08:07.870 Done 51 runs in 2 second(s) 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:07.870 12:50:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:08:07.870 [2024-12-05 12:50:11.108402] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:07.870 [2024-12-05 12:50:11.108470] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid149994 ] 00:08:08.130 [2024-12-05 12:50:11.313947] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.130 [2024-12-05 12:50:11.326251] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.130 [2024-12-05 12:50:11.378629] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.130 [2024-12-05 12:50:11.394954] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:08:08.130 INFO: Running with entropic power schedule (0xFF, 100). 
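Annotation: the xtrace records above show how the harness rotates between fuzzer types: common.sh increments i and calls start_llvm_fuzz with the fuzzer type, the run time in seconds, and the core mask; the function then derives a per-run TCP port (44 followed by the zero-padded type, so type 3 listens on 4403), creates a fresh corpus directory, rewrites trsvcid in fuzz_json.conf, registers two LeakSanitizer suppressions for allocations the target is known to hold, and launches llvm_nvme_fuzz with -Z selecting which admin-command fuzzer runs. Below is a minimal reconstruction of that step pieced together from the trace; $rootdir and the redirection of the sed output into $nvmf_cfg are assumptions, everything else is taken verbatim from the commands shown.

  # Sketch of start_llvm_fuzz as implied by the xtrace above; $rootdir is
  # assumed to point at the SPDK checkout.
  start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3
    local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
    local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    local suppress_file=/var/tmp/suppress_nvmf_fuzz
    local LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0
    local port
    port=44$(printf %02d $fuzzer_type)    # 4402, 4403, 4404, ...
    mkdir -p $corpus_dir
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # Point this run's target config at the per-run port (output path assumed).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg
    # Suppress known in-target allocations so LeakSanitizer does not fail the run.
    echo leak:spdk_nvmf_qpair_disconnect > $suppress_file
    echo leak:nvmf_ctrlr_create >> $suppress_file
    $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
      -P $rootdir/../output/llvm/ -F "$trid" -c $nvmf_cfg -t $timen \
      -D $corpus_dir -Z $fuzzer_type
  }

The rm -rf of the previous run's /tmp/fuzz_json_*.conf and suppression file at run.sh@54 is the matching teardown before the next type starts.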
00:08:08.130 INFO: Seed: 2868520292 00:08:08.130 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:08.130 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:08.130 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:08.130 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.130 #2 INITED exec/s: 0 rss: 64Mb 00:08:08.130 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:08.130 This may also happen if the target rejected all inputs we tried so far 00:08:08.649 NEW_FUNC[1/705]: 0x457838 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:08:08.649 NEW_FUNC[2/705]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.649 #5 NEW cov: 12024 ft: 12024 corp: 2/5b lim: 20 exec/s: 0 rss: 72Mb L: 4/4 MS: 3 ChangeByte-CrossOver-CopyPart- 00:08:08.649 #7 NEW cov: 12154 ft: 12574 corp: 3/12b lim: 20 exec/s: 0 rss: 72Mb L: 7/7 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:08.649 [2024-12-05 12:50:11.831355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.649 [2024-12-05 12:50:11.831388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.649 NEW_FUNC[1/17]: 0x1394b98 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3484 00:08:08.649 NEW_FUNC[2/17]: 0x1395718 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3426 00:08:08.649 #8 NEW cov: 12420 ft: 13361 corp: 4/20b lim: 20 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 InsertByte- 00:08:08.649 #11 NEW cov: 12505 ft: 13765 corp: 5/30b lim: 20 exec/s: 0 rss: 72Mb L: 10/10 MS: 3 CopyPart-ChangeByte-CMP- DE: "\001\230\202\331W\015\330\004"- 00:08:08.649 [2024-12-05 12:50:11.931648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.649 [2024-12-05 12:50:11.931676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.908 #12 NEW cov: 12505 ft: 13872 corp: 6/38b lim: 20 exec/s: 0 rss: 72Mb L: 8/10 MS: 1 CopyPart- 00:08:08.908 NEW_FUNC[1/2]: 0x150c478 in nvmf_transport_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:819 00:08:08.908 NEW_FUNC[2/2]: 0x15329d8 in nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3687 00:08:08.908 #13 NEW cov: 12561 ft: 14034 corp: 7/46b lim: 20 exec/s: 0 rss: 72Mb L: 8/10 MS: 1 InsertByte- 00:08:08.908 #14 NEW cov: 12561 ft: 14082 corp: 8/52b lim: 20 exec/s: 0 rss: 72Mb L: 6/10 MS: 1 CopyPart- 00:08:08.908 #16 NEW cov: 12561 ft: 14125 corp: 9/63b lim: 20 exec/s: 0 rss: 72Mb L: 11/11 MS: 2 EraseBytes-CrossOver- 00:08:08.908 #17 NEW cov: 12561 ft: 14201 corp: 10/67b lim: 20 exec/s: 0 rss: 72Mb L: 4/11 MS: 1 ChangeBit- 00:08:08.908 #18 NEW cov: 12561 ft: 14268 corp: 11/71b lim: 20 exec/s: 0 rss: 72Mb L: 4/11 MS: 1 CopyPart- 00:08:09.166 #20 NEW cov: 12565 ft: 14540 corp: 12/85b lim: 20 exec/s: 0 rss: 72Mb L: 14/14 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:09.166 #21 NEW cov: 12565 
ft: 14586 corp: 13/99b lim: 20 exec/s: 0 rss: 73Mb L: 14/14 MS: 1 ChangeByte- 00:08:09.166 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:09.166 #22 NEW cov: 12588 ft: 14627 corp: 14/110b lim: 20 exec/s: 0 rss: 73Mb L: 11/14 MS: 1 ChangeBit- 00:08:09.166 #23 NEW cov: 12588 ft: 14633 corp: 15/116b lim: 20 exec/s: 0 rss: 73Mb L: 6/14 MS: 1 CopyPart- 00:08:09.166 [2024-12-05 12:50:12.443123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.166 [2024-12-05 12:50:12.443150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.424 #24 NEW cov: 12588 ft: 14707 corp: 16/130b lim: 20 exec/s: 24 rss: 73Mb L: 14/14 MS: 1 ChangeBinInt- 00:08:09.424 #25 NEW cov: 12588 ft: 14718 corp: 17/142b lim: 20 exec/s: 25 rss: 73Mb L: 12/14 MS: 1 CopyPart- 00:08:09.424 #26 NEW cov: 12588 ft: 14760 corp: 18/149b lim: 20 exec/s: 26 rss: 73Mb L: 7/14 MS: 1 EraseBytes- 00:08:09.424 [2024-12-05 12:50:12.623553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.424 [2024-12-05 12:50:12.623579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.424 #27 NEW cov: 12588 ft: 14776 corp: 19/157b lim: 20 exec/s: 27 rss: 73Mb L: 8/14 MS: 1 ChangeByte- 00:08:09.424 #28 NEW cov: 12588 ft: 14804 corp: 20/161b lim: 20 exec/s: 28 rss: 73Mb L: 4/14 MS: 1 CopyPart- 00:08:09.683 #29 NEW cov: 12588 ft: 14818 corp: 21/175b lim: 20 exec/s: 29 rss: 73Mb L: 14/14 MS: 1 PersAutoDict- DE: "\001\230\202\331W\015\330\004"- 00:08:09.683 #30 NEW cov: 12588 ft: 14898 corp: 22/181b lim: 20 exec/s: 30 rss: 73Mb L: 6/14 MS: 1 ChangeByte- 00:08:09.683 [2024-12-05 12:50:12.804060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.683 [2024-12-05 12:50:12.804086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.683 #31 NEW cov: 12588 ft: 14902 corp: 23/189b lim: 20 exec/s: 31 rss: 73Mb L: 8/14 MS: 1 InsertByte- 00:08:09.683 #32 NEW cov: 12588 ft: 14907 corp: 24/195b lim: 20 exec/s: 32 rss: 73Mb L: 6/14 MS: 1 ShuffleBytes- 00:08:09.683 #33 NEW cov: 12588 ft: 14950 corp: 25/206b lim: 20 exec/s: 33 rss: 73Mb L: 11/14 MS: 1 ChangeBit- 00:08:09.683 #34 NEW cov: 12588 ft: 14967 corp: 26/220b lim: 20 exec/s: 34 rss: 73Mb L: 14/14 MS: 1 CopyPart- 00:08:09.943 #35 NEW cov: 12588 ft: 15007 corp: 27/231b lim: 20 exec/s: 35 rss: 73Mb L: 11/14 MS: 1 ChangeByte- 00:08:09.943 #36 NEW cov: 12588 ft: 15044 corp: 28/236b lim: 20 exec/s: 36 rss: 73Mb L: 5/14 MS: 1 CopyPart- 00:08:09.943 #37 NEW cov: 12588 ft: 15054 corp: 29/240b lim: 20 exec/s: 37 rss: 73Mb L: 4/14 MS: 1 ChangeBit- 00:08:09.943 #42 NEW cov: 12588 ft: 15080 corp: 30/247b lim: 20 exec/s: 42 rss: 73Mb L: 7/14 MS: 5 InsertByte-EraseBytes-InsertByte-ShuffleBytes-CrossOver- 00:08:09.943 #43 NEW cov: 12588 ft: 15101 corp: 31/262b lim: 20 exec/s: 43 rss: 73Mb L: 15/15 MS: 1 InsertByte- 00:08:09.943 [2024-12-05 12:50:13.205468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.943 [2024-12-05 12:50:13.205496] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.943 NEW_FUNC[1/1]: 0x15b72a8 in _nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3634 00:08:09.943 #44 NEW cov: 12632 ft: 15361 corp: 32/281b lim: 20 exec/s: 44 rss: 73Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:08:10.202 #45 NEW cov: 12632 ft: 15392 corp: 33/296b lim: 20 exec/s: 45 rss: 73Mb L: 15/19 MS: 1 InsertByte- 00:08:10.203 #46 NEW cov: 12632 ft: 15438 corp: 34/302b lim: 20 exec/s: 46 rss: 74Mb L: 6/19 MS: 1 CrossOver- 00:08:10.203 #47 NEW cov: 12632 ft: 15469 corp: 35/308b lim: 20 exec/s: 47 rss: 74Mb L: 6/19 MS: 1 ChangeBinInt- 00:08:10.203 #48 NEW cov: 12632 ft: 15479 corp: 36/319b lim: 20 exec/s: 24 rss: 74Mb L: 11/19 MS: 1 ChangeBit- 00:08:10.203 #48 DONE cov: 12632 ft: 15479 corp: 36/319b lim: 20 exec/s: 24 rss: 74Mb 00:08:10.203 ###### Recommended dictionary. ###### 00:08:10.203 "\001\230\202\331W\015\330\004" # Uses: 1 00:08:10.203 ###### End of recommended dictionary. ###### 00:08:10.203 Done 48 runs in 2 second(s) 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:10.463 12:50:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 
-c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:08:10.463 [2024-12-05 12:50:13.589819] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:10.463 [2024-12-05 12:50:13.589919] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid150467 ] 00:08:10.723 [2024-12-05 12:50:13.788648] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.723 [2024-12-05 12:50:13.800994] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.723 [2024-12-05 12:50:13.853322] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.723 [2024-12-05 12:50:13.869662] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:10.723 INFO: Running with entropic power schedule (0xFF, 100). 00:08:10.723 INFO: Seed: 1048546604 00:08:10.723 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:10.723 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:10.723 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:10.723 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.723 #2 INITED exec/s: 0 rss: 65Mb 00:08:10.723 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:10.723 This may also happen if the target rejected all inputs we tried so far 00:08:10.723 [2024-12-05 12:50:13.945721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2e7026dd cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.723 [2024-12-05 12:50:13.945759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.983 NEW_FUNC[1/717]: 0x458938 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:10.983 NEW_FUNC[2/717]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:10.983 #7 NEW cov: 12151 ft: 12140 corp: 2/13b lim: 35 exec/s: 0 rss: 73Mb L: 12/12 MS: 5 CopyPart-InsertByte-InsertByte-InsertByte-CMP- DE: "p\000\000\000\000\000\000\000"- 00:08:10.983 [2024-12-05 12:50:14.276788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.983 [2024-12-05 12:50:14.276844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.243 #8 NEW cov: 12281 ft: 12643 corp: 3/21b lim: 35 exec/s: 0 rss: 73Mb L: 8/12 MS: 1 InsertRepeatedBytes- 00:08:11.243 [2024-12-05 12:50:14.316612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2e702cdd cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.243 [2024-12-05 12:50:14.316639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.243 #9 NEW cov: 12287 ft: 12988 corp: 4/33b lim: 35 exec/s: 0 rss: 73Mb L: 12/12 MS: 1 ChangeByte- 
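Annotation: the #N NEW lines above follow standard libFuzzer status output: cov counts covered code blocks or edges, ft counts internal coverage features, corp gives corpus size as units/total bytes, lim is the current input-length cap, L: a/b is the new unit's length over the largest unit seen so far, and MS lists the mutation count and sequence that produced it (DE naming any dictionary entry used). To replay this fuzzer type in isolation, the invocation recorded in the trace can be reused directly; the flags and paths below are copied from it, $SPDK_DIR is just shorthand for the checkout, and the per-run config /tmp/fuzz_json_4.conf is assumed to have been generated by the sed step first.

  # Re-running fuzzer type 4 (CREATE IO CQ admin command) by hand with the
  # flags from the trace above.
  SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  $SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 \
    -P $SPDK_DIR/../output/llvm/ \
    -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' \
    -c /tmp/fuzz_json_4.conf -t 1 -D $SPDK_DIR/../corpus/llvm_nvmf_4 -Z 4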
00:08:11.243 [2024-12-05 12:50:14.376785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2e7026dd cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.243 [2024-12-05 12:50:14.376811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.243 #10 NEW cov: 12372 ft: 13247 corp: 5/45b lim: 35 exec/s: 0 rss: 73Mb L: 12/12 MS: 1 ChangeByte- 00:08:11.243 [2024-12-05 12:50:14.417666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.243 [2024-12-05 12:50:14.417693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.243 [2024-12-05 12:50:14.417810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.243 [2024-12-05 12:50:14.417828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.243 [2024-12-05 12:50:14.417947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.243 [2024-12-05 12:50:14.417962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.243 [2024-12-05 12:50:14.418089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:95959595 cdw11:95950000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.243 [2024-12-05 12:50:14.418105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.243 #15 NEW cov: 12372 ft: 14109 corp: 6/73b lim: 35 exec/s: 0 rss: 73Mb L: 28/28 MS: 5 ShuffleBytes-ShuffleBytes-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:11.243 [2024-12-05 12:50:14.456990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00009b70 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.243 [2024-12-05 12:50:14.457019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.243 #18 NEW cov: 12372 ft: 14258 corp: 7/82b lim: 35 exec/s: 0 rss: 73Mb L: 9/28 MS: 3 ChangeByte-ChangeByte-PersAutoDict- DE: "p\000\000\000\000\000\000\000"- 00:08:11.243 [2024-12-05 12:50:14.497638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:95950a00 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.243 [2024-12-05 12:50:14.497666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.243 [2024-12-05 12:50:14.497778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.243 [2024-12-05 12:50:14.497795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.243 [2024-12-05 12:50:14.497919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 
cdw10:95959595 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.243 [2024-12-05 12:50:14.497935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.243 #19 NEW cov: 12372 ft: 14516 corp: 8/106b lim: 35 exec/s: 0 rss: 73Mb L: 24/28 MS: 1 CrossOver- 00:08:11.503 [2024-12-05 12:50:14.557330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002cdd cdw11:000d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.503 [2024-12-05 12:50:14.557358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.503 #20 NEW cov: 12372 ft: 14570 corp: 9/118b lim: 35 exec/s: 0 rss: 73Mb L: 12/28 MS: 1 CMP- DE: "\000\000\000\015"- 00:08:11.503 [2024-12-05 12:50:14.617502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:dd2e3d26 cdw11:70000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.503 [2024-12-05 12:50:14.617531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.504 #21 NEW cov: 12372 ft: 14597 corp: 10/131b lim: 35 exec/s: 0 rss: 73Mb L: 13/28 MS: 1 InsertByte- 00:08:11.504 [2024-12-05 12:50:14.677729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00009b70 cdw11:00500000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.504 [2024-12-05 12:50:14.677757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.504 #22 NEW cov: 12372 ft: 14659 corp: 11/140b lim: 35 exec/s: 0 rss: 73Mb L: 9/28 MS: 1 ChangeByte- 00:08:11.504 [2024-12-05 12:50:14.737804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00400a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.504 [2024-12-05 12:50:14.737837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.504 #23 NEW cov: 12372 ft: 14684 corp: 12/148b lim: 35 exec/s: 0 rss: 73Mb L: 8/28 MS: 1 ChangeBit- 00:08:11.504 [2024-12-05 12:50:14.787957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:007c9b70 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.504 [2024-12-05 12:50:14.787985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.504 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:11.504 #24 NEW cov: 12395 ft: 14751 corp: 13/157b lim: 35 exec/s: 0 rss: 73Mb L: 9/28 MS: 1 ChangeByte- 00:08:11.764 [2024-12-05 12:50:14.838933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:14.838960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.764 [2024-12-05 12:50:14.839085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:14.839101] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.764 [2024-12-05 12:50:14.839219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:14.839235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.764 [2024-12-05 12:50:14.839361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:14.839378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.764 #25 NEW cov: 12395 ft: 14783 corp: 14/189b lim: 35 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:08:11.764 [2024-12-05 12:50:14.898371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:14.898399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.764 #26 NEW cov: 12395 ft: 14795 corp: 15/197b lim: 35 exec/s: 26 rss: 74Mb L: 8/32 MS: 1 ShuffleBytes- 00:08:11.764 [2024-12-05 12:50:14.938439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2e702cdd cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:14.938468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.764 #27 NEW cov: 12395 ft: 14822 corp: 16/209b lim: 35 exec/s: 27 rss: 74Mb L: 12/32 MS: 1 ChangeBinInt- 00:08:11.764 [2024-12-05 12:50:14.979040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:24950a00 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:14.979067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.764 [2024-12-05 12:50:14.979193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:14.979209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.764 [2024-12-05 12:50:14.979331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:95959595 cdw11:95000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:14.979348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.764 #28 NEW cov: 12395 ft: 14884 corp: 17/234b lim: 35 exec/s: 28 rss: 74Mb L: 25/32 MS: 1 InsertByte- 00:08:11.764 [2024-12-05 12:50:15.039516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:15.039543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.764 [2024-12-05 12:50:15.039662] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:15.039680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.764 [2024-12-05 12:50:15.039806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:15.039825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.764 [2024-12-05 12:50:15.039945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-12-05 12:50:15.039963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.031 #29 NEW cov: 12395 ft: 14917 corp: 18/266b lim: 35 exec/s: 29 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:12.031 [2024-12-05 12:50:15.098853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00400a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.031 [2024-12-05 12:50:15.098881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.031 #30 NEW cov: 12395 ft: 14935 corp: 19/274b lim: 35 exec/s: 30 rss: 74Mb L: 8/32 MS: 1 ShuffleBytes- 00:08:12.031 [2024-12-05 12:50:15.159055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002cdd cdw11:000a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.031 [2024-12-05 12:50:15.159083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.031 #32 NEW cov: 12395 ft: 14951 corp: 20/283b lim: 35 exec/s: 32 rss: 74Mb L: 9/32 MS: 2 EraseBytes-CopyPart- 00:08:12.031 [2024-12-05 12:50:15.210022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:24950a00 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.031 [2024-12-05 12:50:15.210050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.031 [2024-12-05 12:50:15.210172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.031 [2024-12-05 12:50:15.210190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.031 [2024-12-05 12:50:15.210310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:95950095 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.031 [2024-12-05 12:50:15.210326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.031 [2024-12-05 12:50:15.210446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:95959595 cdw11:95950000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.031 [2024-12-05 12:50:15.210463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.031 #33 NEW cov: 12395 ft: 14983 corp: 21/316b lim: 35 exec/s: 33 rss: 74Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:12.031 [2024-12-05 12:50:15.269305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:dd2e3d26 cdw11:70000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.031 [2024-12-05 12:50:15.269332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.031 #34 NEW cov: 12395 ft: 15011 corp: 22/329b lim: 35 exec/s: 34 rss: 74Mb L: 13/33 MS: 1 ChangeByte- 00:08:12.031 [2024-12-05 12:50:15.329483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2e702cdd cdw11:00fb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.031 [2024-12-05 12:50:15.329508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.290 #35 NEW cov: 12395 ft: 15035 corp: 23/341b lim: 35 exec/s: 35 rss: 74Mb L: 12/33 MS: 1 ChangeBinInt- 00:08:12.290 [2024-12-05 12:50:15.389622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00009b70 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.290 [2024-12-05 12:50:15.389651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.290 #36 NEW cov: 12395 ft: 15043 corp: 24/350b lim: 35 exec/s: 36 rss: 74Mb L: 9/33 MS: 1 CopyPart- 00:08:12.290 [2024-12-05 12:50:15.429740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2e702cdd cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.290 [2024-12-05 12:50:15.429767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.290 #37 NEW cov: 12395 ft: 15055 corp: 25/362b lim: 35 exec/s: 37 rss: 74Mb L: 12/33 MS: 1 ChangeBit- 00:08:12.290 [2024-12-05 12:50:15.469882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2e702cdd cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.290 [2024-12-05 12:50:15.469907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.290 #38 NEW cov: 12395 ft: 15094 corp: 26/374b lim: 35 exec/s: 38 rss: 74Mb L: 12/33 MS: 1 ChangeBit- 00:08:12.290 [2024-12-05 12:50:15.510032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00009b70 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.290 [2024-12-05 12:50:15.510059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.290 #39 NEW cov: 12395 ft: 15103 corp: 27/384b lim: 35 exec/s: 39 rss: 74Mb L: 10/33 MS: 1 CopyPart- 00:08:12.290 [2024-12-05 12:50:15.570163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002cdd cdw11:00070003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.290 [2024-12-05 12:50:15.570188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.550 #40 NEW cov: 12395 ft: 15184 corp: 28/393b lim: 35 exec/s: 40 rss: 74Mb L: 9/33 MS: 1 
ChangeBinInt- 00:08:12.550 [2024-12-05 12:50:15.630339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2b2b9b2b cdw11:70000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.550 [2024-12-05 12:50:15.630366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.550 #41 NEW cov: 12395 ft: 15190 corp: 29/406b lim: 35 exec/s: 41 rss: 75Mb L: 13/33 MS: 1 InsertRepeatedBytes- 00:08:12.550 [2024-12-05 12:50:15.690753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2e7026dd cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.550 [2024-12-05 12:50:15.690778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.550 [2024-12-05 12:50:15.690905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:32000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.550 [2024-12-05 12:50:15.690923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.550 #42 NEW cov: 12395 ft: 15417 corp: 30/422b lim: 35 exec/s: 42 rss: 75Mb L: 16/33 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:12.550 [2024-12-05 12:50:15.730653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:000c9b70 cdw11:00500000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.550 [2024-12-05 12:50:15.730681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.550 #43 NEW cov: 12395 ft: 15458 corp: 31/431b lim: 35 exec/s: 43 rss: 75Mb L: 9/33 MS: 1 CMP- DE: "\000\014"- 00:08:12.550 [2024-12-05 12:50:15.790821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:000c9b70 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.550 [2024-12-05 12:50:15.790852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.550 #44 NEW cov: 12395 ft: 15512 corp: 32/444b lim: 35 exec/s: 44 rss: 75Mb L: 13/33 MS: 1 CopyPart- 00:08:12.550 [2024-12-05 12:50:15.851711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.550 [2024-12-05 12:50:15.851737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.550 [2024-12-05 12:50:15.851863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.550 [2024-12-05 12:50:15.851880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.550 [2024-12-05 12:50:15.852001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:95954195 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.550 [2024-12-05 12:50:15.852018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.550 [2024-12-05 12:50:15.852141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.550 [2024-12-05 12:50:15.852157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.810 #45 NEW cov: 12395 ft: 15518 corp: 33/473b lim: 35 exec/s: 45 rss: 75Mb L: 29/33 MS: 1 InsertByte- 00:08:12.810 [2024-12-05 12:50:15.891065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00006070 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.810 [2024-12-05 12:50:15.891092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.810 #46 NEW cov: 12395 ft: 15534 corp: 34/483b lim: 35 exec/s: 46 rss: 75Mb L: 10/33 MS: 1 ChangeByte- 00:08:12.810 [2024-12-05 12:50:15.931719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:24950a00 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.810 [2024-12-05 12:50:15.931745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.810 [2024-12-05 12:50:15.931877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:95959595 cdw11:95950001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.810 [2024-12-05 12:50:15.931895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.810 [2024-12-05 12:50:15.932004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:95959595 cdw11:95000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.810 [2024-12-05 12:50:15.932019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.810 #47 NEW cov: 12395 ft: 15540 corp: 35/510b lim: 35 exec/s: 23 rss: 75Mb L: 27/33 MS: 1 PersAutoDict- DE: "\000\014"- 00:08:12.810 #47 DONE cov: 12395 ft: 15540 corp: 35/510b lim: 35 exec/s: 23 rss: 75Mb 00:08:12.810 ###### Recommended dictionary. ###### 00:08:12.810 "p\000\000\000\000\000\000\000" # Uses: 1 00:08:12.810 "\000\000\000\015" # Uses: 0 00:08:12.810 "\377\377\377\377" # Uses: 0 00:08:12.810 "\000\014" # Uses: 1 00:08:12.810 ###### End of recommended dictionary. 
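The recommended dictionary printed above can be saved as a libFuzzer dictionary file and fed back to later runs with -dict=. A minimal sketch, assuming the entries are rewritten with \xNN hex escapes (libFuzzer dictionary files take hex escapes, not the octal escapes the log prints); the file name nvmf_4.dict is illustrative, not something this job produces:

    # nvmf_4.dict -- hypothetical dictionary built from the entries above
    "p\x00\x00\x00\x00\x00\x00\x00"
    "\x00\x00\x00\x0d"
    "\xff\xff\xff\xff"
    "\x00\x0c"

Passing -dict=nvmf_4.dict on a libFuzzer command line would seed the mutator with these tokens; whether the SPDK run.sh wrapper forwards such flags is not shown in this log.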
###### 00:08:12.810 Done 47 runs in 2 second(s) 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:12.810 12:50:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:08:12.810 [2024-12-05 12:50:16.099954] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:12.810 [2024-12-05 12:50:16.100024] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid150830 ] 00:08:13.070 [2024-12-05 12:50:16.309178] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.070 [2024-12-05 12:50:16.322318] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.070 [2024-12-05 12:50:16.374739] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.330 [2024-12-05 12:50:16.391056] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:13.330 INFO: Running with entropic power schedule (0xFF, 100). 
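The llvm_nvme_fuzz invocation above is a libFuzzer-instrumented SPDK application; the TestOneInput symbol reported in the NEW_FUNC lines that follow is its per-input callback in llvm_nvme_fuzz.c. A minimal sketch of that shape in C, as a hypothetical stand-in rather than the actual SPDK harness (which additionally parses the -F/-Z/-D options shown above and drives the NVMe-oF target at the given trid):

    /* Hypothetical minimal harness of the same shape as the TestOneInput
     * callback; not the SPDK implementation. Build with:
     *   clang -g -fsanitize=fuzzer,address harness.c -o harness
     */
    #include <stddef.h>
    #include <stdint.h>

    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        /* libFuzzer calls this once per generated input; a harness like
         * the one in this log decodes the bytes into an NVMe command and
         * submits it to the target under test. */
        if (size < 8)
            return 0;   /* ignore inputs too short to form a command */
        /* ... exercise the code under test with data/size here ... */
        return 0;       /* non-crashing inputs return 0 */
    }

Each "#N NEW cov: ... MS: ..." record below reports that a mutated input (the MS: field lists the mutation sequence) reached new coverage and was added to the in-memory corpus.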
00:08:13.330 INFO: Seed: 3569543189 00:08:13.330 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:13.330 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:13.330 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:13.330 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.330 #2 INITED exec/s: 0 rss: 65Mb 00:08:13.330 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:13.330 This may also happen if the target rejected all inputs we tried so far 00:08:13.330 [2024-12-05 12:50:16.457922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00007a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.330 [2024-12-05 12:50:16.457958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.330 [2024-12-05 12:50:16.458087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.330 [2024-12-05 12:50:16.458106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.330 [2024-12-05 12:50:16.458220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.330 [2024-12-05 12:50:16.458237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.330 [2024-12-05 12:50:16.458354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.330 [2024-12-05 12:50:16.458373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.590 NEW_FUNC[1/717]: 0x45aad8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:13.590 NEW_FUNC[2/717]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.590 #9 NEW cov: 12180 ft: 12181 corp: 2/43b lim: 45 exec/s: 0 rss: 73Mb L: 42/42 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:13.590 [2024-12-05 12:50:16.809053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.590 [2024-12-05 12:50:16.809096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.590 [2024-12-05 12:50:16.809225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.590 [2024-12-05 12:50:16.809244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.590 [2024-12-05 12:50:16.809367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.590 
[2024-12-05 12:50:16.809388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.590 [2024-12-05 12:50:16.809510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.590 [2024-12-05 12:50:16.809530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.590 #13 NEW cov: 12293 ft: 12783 corp: 3/85b lim: 45 exec/s: 0 rss: 73Mb L: 42/42 MS: 4 CopyPart-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:13.590 [2024-12-05 12:50:16.859124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00807a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.590 [2024-12-05 12:50:16.859155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.590 [2024-12-05 12:50:16.859273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.590 [2024-12-05 12:50:16.859293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.590 [2024-12-05 12:50:16.859426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.590 [2024-12-05 12:50:16.859443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.590 [2024-12-05 12:50:16.859565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.590 [2024-12-05 12:50:16.859585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.590 #14 NEW cov: 12299 ft: 12955 corp: 4/127b lim: 45 exec/s: 0 rss: 73Mb L: 42/42 MS: 1 ChangeBit- 00:08:13.851 [2024-12-05 12:50:16.928502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:16.928533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.851 #16 NEW cov: 12384 ft: 14105 corp: 5/138b lim: 45 exec/s: 0 rss: 73Mb L: 11/42 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:13.851 [2024-12-05 12:50:16.979352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:16.979379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.851 [2024-12-05 12:50:16.979498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:16.979517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.851 [2024-12-05 12:50:16.979645] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:16.979663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.851 [2024-12-05 12:50:16.979779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:16.979796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.851 #17 NEW cov: 12384 ft: 14208 corp: 6/180b lim: 45 exec/s: 0 rss: 73Mb L: 42/42 MS: 1 ShuffleBytes- 00:08:13.851 [2024-12-05 12:50:17.049573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:17.049599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.851 [2024-12-05 12:50:17.049724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:17.049742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.851 [2024-12-05 12:50:17.049877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:17.049893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.851 [2024-12-05 12:50:17.050019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:17.050036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.851 #18 NEW cov: 12384 ft: 14272 corp: 7/222b lim: 45 exec/s: 0 rss: 73Mb L: 42/42 MS: 1 ChangeBinInt- 00:08:13.851 [2024-12-05 12:50:17.119759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:17.119786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.851 [2024-12-05 12:50:17.119904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:17.119921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.851 [2024-12-05 12:50:17.120044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:17.120063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.851 [2024-12-05 12:50:17.120194] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:b8d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.851 [2024-12-05 12:50:17.120210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.851 #19 NEW cov: 12384 ft: 14339 corp: 8/264b lim: 45 exec/s: 0 rss: 73Mb L: 42/42 MS: 1 ChangeByte- 00:08:14.111 [2024-12-05 12:50:17.170048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.111 [2024-12-05 12:50:17.170074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.111 [2024-12-05 12:50:17.170196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.111 [2024-12-05 12:50:17.170215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.111 [2024-12-05 12:50:17.170334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.111 [2024-12-05 12:50:17.170350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.111 [2024-12-05 12:50:17.170468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:7ed3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.111 [2024-12-05 12:50:17.170484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.111 #20 NEW cov: 12384 ft: 14443 corp: 9/307b lim: 45 exec/s: 0 rss: 73Mb L: 43/43 MS: 1 InsertByte- 00:08:14.111 [2024-12-05 12:50:17.220162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00007a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.220188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.112 [2024-12-05 12:50:17.220306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.220323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.112 [2024-12-05 12:50:17.220444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.220460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.112 [2024-12-05 12:50:17.220585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.220600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.112 #21 NEW cov: 12384 ft: 14491 
corp: 10/351b lim: 45 exec/s: 0 rss: 73Mb L: 44/44 MS: 1 CopyPart- 00:08:14.112 [2024-12-05 12:50:17.270350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.270377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.112 [2024-12-05 12:50:17.270498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.270514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.112 [2024-12-05 12:50:17.270634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.270650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.112 [2024-12-05 12:50:17.270773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:b8d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.270788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.112 #22 NEW cov: 12384 ft: 14548 corp: 11/393b lim: 45 exec/s: 0 rss: 73Mb L: 42/44 MS: 1 ChangeBinInt- 00:08:14.112 [2024-12-05 12:50:17.340563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.340591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.112 [2024-12-05 12:50:17.340712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.340730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.112 [2024-12-05 12:50:17.340855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.340872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.112 [2024-12-05 12:50:17.340995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:b83fd3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.341010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.112 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:14.112 #23 NEW cov: 12401 ft: 14573 corp: 12/436b lim: 45 exec/s: 0 rss: 73Mb L: 43/44 MS: 1 InsertByte- 00:08:14.112 [2024-12-05 12:50:17.389861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.112 [2024-12-05 12:50:17.389887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.112 #25 NEW cov: 12401 ft: 14607 corp: 13/446b lim: 45 exec/s: 0 rss: 73Mb L: 10/44 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:14.372 [2024-12-05 12:50:17.440858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.372 [2024-12-05 12:50:17.440886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.372 [2024-12-05 12:50:17.441010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.372 [2024-12-05 12:50:17.441028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.372 [2024-12-05 12:50:17.441142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.372 [2024-12-05 12:50:17.441159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.372 [2024-12-05 12:50:17.441279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3b8d3d3 cdw11:d33f0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.372 [2024-12-05 12:50:17.441298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.372 #26 NEW cov: 12401 ft: 14641 corp: 14/489b lim: 45 exec/s: 26 rss: 73Mb L: 43/44 MS: 1 ShuffleBytes- 00:08:14.372 [2024-12-05 12:50:17.510481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.372 [2024-12-05 12:50:17.510508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.372 [2024-12-05 12:50:17.510634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.372 [2024-12-05 12:50:17.510651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.372 #29 NEW cov: 12401 ft: 14933 corp: 15/513b lim: 45 exec/s: 29 rss: 74Mb L: 24/44 MS: 3 EraseBytes-ShuffleBytes-CrossOver- 00:08:14.372 [2024-12-05 12:50:17.581405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00007a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.372 [2024-12-05 12:50:17.581432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.372 [2024-12-05 12:50:17.581550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.372 [2024-12-05 12:50:17.581567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.372 
[2024-12-05 12:50:17.581688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.372 [2024-12-05 12:50:17.581704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.372 [2024-12-05 12:50:17.581820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.372 [2024-12-05 12:50:17.581837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.372 #30 NEW cov: 12401 ft: 14993 corp: 16/557b lim: 45 exec/s: 30 rss: 74Mb L: 44/44 MS: 1 ShuffleBytes- 00:08:14.372 [2024-12-05 12:50:17.651461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.372 [2024-12-05 12:50:17.651486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.372 [2024-12-05 12:50:17.651601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.373 [2024-12-05 12:50:17.651627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.373 [2024-12-05 12:50:17.651751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:66d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.373 [2024-12-05 12:50:17.651767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.373 [2024-12-05 12:50:17.651888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3b8d3d3 cdw11:d33f0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.373 [2024-12-05 12:50:17.651905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.632 #31 NEW cov: 12401 ft: 15005 corp: 17/600b lim: 45 exec/s: 31 rss: 74Mb L: 43/44 MS: 1 ChangeByte- 00:08:14.632 [2024-12-05 12:50:17.721744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.632 [2024-12-05 12:50:17.721775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.632 [2024-12-05 12:50:17.721897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.632 [2024-12-05 12:50:17.721915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.632 [2024-12-05 12:50:17.722034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.632 [2024-12-05 12:50:17.722049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:14.632 [2024-12-05 12:50:17.722170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.632 [2024-12-05 12:50:17.722187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.632 #32 NEW cov: 12401 ft: 15061 corp: 18/642b lim: 45 exec/s: 32 rss: 74Mb L: 42/44 MS: 1 ShuffleBytes- 00:08:14.632 [2024-12-05 12:50:17.791928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.632 [2024-12-05 12:50:17.791955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.632 [2024-12-05 12:50:17.792082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d32d0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.632 [2024-12-05 12:50:17.792099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.632 [2024-12-05 12:50:17.792221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:66d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.632 [2024-12-05 12:50:17.792239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.632 [2024-12-05 12:50:17.792357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3b8d3d3 cdw11:d33f0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.632 [2024-12-05 12:50:17.792375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.632 #33 NEW cov: 12401 ft: 15112 corp: 19/685b lim: 45 exec/s: 33 rss: 74Mb L: 43/44 MS: 1 ChangeBinInt- 00:08:14.632 [2024-12-05 12:50:17.862154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.632 [2024-12-05 12:50:17.862180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.632 [2024-12-05 12:50:17.862302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.632 [2024-12-05 12:50:17.862322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.632 [2024-12-05 12:50:17.862447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.632 [2024-12-05 12:50:17.862464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.632 [2024-12-05 12:50:17.862583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.633 [2024-12-05 12:50:17.862602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 
dnr:0 00:08:14.633 #34 NEW cov: 12401 ft: 15124 corp: 20/727b lim: 45 exec/s: 34 rss: 74Mb L: 42/44 MS: 1 ShuffleBytes- 00:08:14.633 [2024-12-05 12:50:17.912552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00007a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.633 [2024-12-05 12:50:17.912580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.633 [2024-12-05 12:50:17.912708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.633 [2024-12-05 12:50:17.912725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.633 [2024-12-05 12:50:17.912844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.633 [2024-12-05 12:50:17.912862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.633 [2024-12-05 12:50:17.912984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.633 [2024-12-05 12:50:17.913000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.633 [2024-12-05 12:50:17.913122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.633 [2024-12-05 12:50:17.913139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.633 #35 NEW cov: 12401 ft: 15200 corp: 21/772b lim: 45 exec/s: 35 rss: 74Mb L: 45/45 MS: 1 CopyPart- 00:08:14.893 [2024-12-05 12:50:17.962434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:17.962462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:17.962587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:17.962605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:17.962730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:17.962748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:17.962873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3b8d3d3 cdw11:d33f0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:17.962890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 
dnr:0 00:08:14.893 #36 NEW cov: 12401 ft: 15212 corp: 22/815b lim: 45 exec/s: 36 rss: 74Mb L: 43/45 MS: 1 ChangeBinInt- 00:08:14.893 [2024-12-05 12:50:18.012651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.012678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:18.012796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.012817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:18.012940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.012957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:18.013074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3d32cd3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.013093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.893 #37 NEW cov: 12401 ft: 15218 corp: 23/857b lim: 45 exec/s: 37 rss: 74Mb L: 42/45 MS: 1 ChangeBinInt- 00:08:14.893 [2024-12-05 12:50:18.062789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.062817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:18.062940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.062972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:18.063091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:2c352d2c cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.063108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:18.063231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.063247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.893 #38 NEW cov: 12401 ft: 15234 corp: 24/899b lim: 45 exec/s: 38 rss: 74Mb L: 42/45 MS: 1 ChangeBinInt- 00:08:14.893 [2024-12-05 12:50:18.132153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 
12:50:18.132182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.893 #39 NEW cov: 12401 ft: 15271 corp: 25/913b lim: 45 exec/s: 39 rss: 74Mb L: 14/45 MS: 1 CMP- DE: "\001\000\000\034"- 00:08:14.893 [2024-12-05 12:50:18.203282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.203310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:18.203434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.203450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:18.203571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.203589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.893 [2024-12-05 12:50:18.203717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3b8d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.893 [2024-12-05 12:50:18.203738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.154 #40 NEW cov: 12401 ft: 15296 corp: 26/955b lim: 45 exec/s: 40 rss: 74Mb L: 42/45 MS: 1 ShuffleBytes- 00:08:15.154 [2024-12-05 12:50:18.273372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.273400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.154 [2024-12-05 12:50:18.273515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.273535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.154 [2024-12-05 12:50:18.273659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.273678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.154 [2024-12-05 12:50:18.273798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.273815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.154 #41 NEW cov: 12401 ft: 15302 corp: 27/997b lim: 45 exec/s: 41 rss: 74Mb L: 42/45 MS: 1 ChangeByte- 00:08:15.154 [2024-12-05 12:50:18.322723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.322750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.154 #42 NEW cov: 12408 ft: 15333 corp: 28/1007b lim: 45 exec/s: 42 rss: 74Mb L: 10/45 MS: 1 CopyPart- 00:08:15.154 [2024-12-05 12:50:18.373616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.373644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.154 [2024-12-05 12:50:18.373777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.373793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.154 [2024-12-05 12:50:18.373933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.373951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.154 [2024-12-05 12:50:18.374078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3b8d3d3 cdw11:3fd30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.374097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.154 #43 NEW cov: 12408 ft: 15341 corp: 29/1051b lim: 45 exec/s: 43 rss: 74Mb L: 44/45 MS: 1 CopyPart- 00:08:15.154 [2024-12-05 12:50:18.423737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.423763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.154 [2024-12-05 12:50:18.423902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d3d3d3d3 cdw11:d3d30001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.423922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.154 [2024-12-05 12:50:18.424037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d36635d3 cdw11:d3d30006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.424053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.154 [2024-12-05 12:50:18.424177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d3d3d3d3 cdw11:b8d30001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.154 [2024-12-05 12:50:18.424194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.154 #44 NEW cov: 12408 ft: 15372 corp: 30/1095b lim: 45 exec/s: 22 rss: 74Mb L: 44/45 
MS: 1 InsertByte- 00:08:15.154 #44 DONE cov: 12408 ft: 15372 corp: 30/1095b lim: 45 exec/s: 22 rss: 74Mb
00:08:15.154 ###### Recommended dictionary. ######
00:08:15.154 "\001\000\000\034" # Uses: 0
00:08:15.154 ###### End of recommended dictionary. ######
00:08:15.154 Done 44 runs in 2 second(s)
00:08:15.414 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz
00:08:15.414 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:15.414 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:15.414 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:08:15.414 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6
00:08:15.414 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
00:08:15.414 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
00:08:15.414 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:08:15.414 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf
00:08:15.414 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:15.415 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:15.415 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6
00:08:15.415 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406
00:08:15.415 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:08:15.415 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406'
00:08:15.415 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:15.415 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:15.415 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:15.415 12:50:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6
00:08:15.415 [2024-12-05 12:50:18.608382] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization...
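The trace above is run.sh re-launching the same llvm_nvme_fuzz binary for fuzzer type 6: start_llvm_fuzz derives a per-type TCP listen port (4400 plus the type number, here 4406, matching 4407 and 4408 for the later runs), rewrites trsvcid in fuzz_json.conf, and points -D at a per-type corpus directory. A single run can be reproduced by hand roughly as follows; this is a sketch inferred from the trace, not a documented interface — SPDK_ROOT, the 4400+N port rule, and the redirection of the sed output into the per-run config (only the sed itself is visible above) are assumptions.

  # Rough reproduction of one fuzzer run, following the run.sh trace above.
  SPDK_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # assumption: adjust to your checkout
  N=6                                                             # fuzzer type, as in "start_llvm_fuzz 6 1 0x1"
  PORT=$((4400 + N))                                              # assumption: run N listens on 4400+N, per this log
  # Rewrite the default trsvcid 4420 to the per-run port (redirect target assumed).
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$PORT\"/" \
      "$SPDK_ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$N.conf"
  mkdir -p "$SPDK_ROOT/../corpus/llvm_nvmf_$N"
  "$SPDK_ROOT/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$SPDK_ROOT/../output/llvm/" \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$PORT" \
      -c "/tmp/fuzz_json_$N.conf" -t 1 \
      -D "$SPDK_ROOT/../corpus/llvm_nvmf_$N" -Z "$N"

The -t 1 flag carries the timen=1 value from the trace, which lines up with each run above finishing quickly ("Done 44 runs in 2 second(s)").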
00:08:15.415 [2024-12-05 12:50:18.608455] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid151365 ] 00:08:15.674 [2024-12-05 12:50:18.805671] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.674 [2024-12-05 12:50:18.819034] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.674 [2024-12-05 12:50:18.871658] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.674 [2024-12-05 12:50:18.887942] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:15.674 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.674 INFO: Seed: 1769587721 00:08:15.674 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:15.674 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:15.674 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:15.674 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.674 #2 INITED exec/s: 0 rss: 64Mb 00:08:15.674 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:15.674 This may also happen if the target rejected all inputs we tried so far 00:08:15.674 [2024-12-05 12:50:18.957584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:15.674 [2024-12-05 12:50:18.957618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.192 NEW_FUNC[1/715]: 0x45d2e8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:16.192 NEW_FUNC[2/715]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:16.192 #4 NEW cov: 12097 ft: 12096 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 2 ShuffleBytes-CrossOver- 00:08:16.192 [2024-12-05 12:50:19.298234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:16.192 [2024-12-05 12:50:19.298273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.192 #7 NEW cov: 12210 ft: 12747 corp: 3/6b lim: 10 exec/s: 0 rss: 72Mb L: 3/3 MS: 3 EraseBytes-ChangeBit-CMP- DE: "\001\000"- 00:08:16.192 [2024-12-05 12:50:19.368363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a1a cdw11:00000000 00:08:16.192 [2024-12-05 12:50:19.368394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.192 #8 NEW cov: 12216 ft: 13056 corp: 4/8b lim: 10 exec/s: 0 rss: 72Mb L: 2/3 MS: 1 CrossOver- 00:08:16.192 [2024-12-05 12:50:19.418485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000000 00:08:16.192 [2024-12-05 12:50:19.418512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.192 #9 NEW cov: 12301 ft: 13313 corp: 5/11b lim: 
10 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 ChangeBit- 00:08:16.192 [2024-12-05 12:50:19.478684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000300 cdw11:00000000 00:08:16.192 [2024-12-05 12:50:19.478711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.192 #10 NEW cov: 12301 ft: 13459 corp: 6/14b lim: 10 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 ChangeBinInt- 00:08:16.451 [2024-12-05 12:50:19.529380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.529407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.451 [2024-12-05 12:50:19.529523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.529541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.451 [2024-12-05 12:50:19.529658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.529680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.451 [2024-12-05 12:50:19.529794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.529809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.451 #11 NEW cov: 12301 ft: 13810 corp: 7/23b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:08:16.451 [2024-12-05 12:50:19.579525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.579551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.451 [2024-12-05 12:50:19.579662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000089ff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.579680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.451 [2024-12-05 12:50:19.579801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.579818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.451 [2024-12-05 12:50:19.579937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.579953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.451 #12 NEW cov: 12301 ft: 13881 corp: 8/32b lim: 10 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 ChangeByte- 00:08:16.451 [2024-12-05 12:50:19.649169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000180 cdw11:00000000 
00:08:16.451 [2024-12-05 12:50:19.649197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.451 #13 NEW cov: 12301 ft: 13913 corp: 9/35b lim: 10 exec/s: 0 rss: 73Mb L: 3/9 MS: 1 ChangeBit- 00:08:16.451 [2024-12-05 12:50:19.699606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008201 cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.699634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.451 [2024-12-05 12:50:19.699749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000001a cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.699766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.451 #14 NEW cov: 12301 ft: 14103 corp: 10/39b lim: 10 exec/s: 0 rss: 73Mb L: 4/9 MS: 1 InsertByte- 00:08:16.451 [2024-12-05 12:50:19.750370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a80 cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.750396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.451 [2024-12-05 12:50:19.750512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.750529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.451 [2024-12-05 12:50:19.750640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.750657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.451 [2024-12-05 12:50:19.750779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.750798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.451 [2024-12-05 12:50:19.750921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.451 [2024-12-05 12:50:19.750939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.710 #15 NEW cov: 12301 ft: 14191 corp: 11/49b lim: 10 exec/s: 0 rss: 73Mb L: 10/10 MS: 1 CrossOver- 00:08:16.710 [2024-12-05 12:50:19.799835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000810a cdw11:00000000 00:08:16.710 [2024-12-05 12:50:19.799862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.710 [2024-12-05 12:50:19.799973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:00000000 00:08:16.710 [2024-12-05 12:50:19.799989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.710 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:16.710 #16 NEW cov: 12324 ft: 14228 corp: 12/54b lim: 10 exec/s: 0 rss: 73Mb L: 5/10 MS: 1 CrossOver- 00:08:16.710 [2024-12-05 12:50:19.870074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000012e cdw11:00000000 00:08:16.710 [2024-12-05 12:50:19.870101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.710 [2024-12-05 12:50:19.870219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000001a cdw11:00000000 00:08:16.710 [2024-12-05 12:50:19.870235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.711 #17 NEW cov: 12324 ft: 14308 corp: 13/58b lim: 10 exec/s: 0 rss: 73Mb L: 4/10 MS: 1 InsertByte- 00:08:16.711 [2024-12-05 12:50:19.919960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:16.711 [2024-12-05 12:50:19.919987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.711 #18 NEW cov: 12324 ft: 14329 corp: 14/60b lim: 10 exec/s: 18 rss: 73Mb L: 2/10 MS: 1 CopyPart- 00:08:16.711 [2024-12-05 12:50:19.970135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e01 cdw11:00000000 00:08:16.711 [2024-12-05 12:50:19.970160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.711 #21 NEW cov: 12324 ft: 14339 corp: 15/63b lim: 10 exec/s: 21 rss: 73Mb L: 3/10 MS: 3 CrossOver-CopyPart-PersAutoDict- DE: "\001\000"- 00:08:16.711 [2024-12-05 12:50:20.020764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:16.711 [2024-12-05 12:50:20.020791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.711 [2024-12-05 12:50:20.020913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000089ff cdw11:00000000 00:08:16.711 [2024-12-05 12:50:20.020931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.711 [2024-12-05 12:50:20.021044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.711 [2024-12-05 12:50:20.021062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.971 #22 NEW cov: 12324 ft: 14525 corp: 16/70b lim: 10 exec/s: 22 rss: 73Mb L: 7/10 MS: 1 EraseBytes- 00:08:16.971 [2024-12-05 12:50:20.091095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000300 cdw11:00000000 00:08:16.971 [2024-12-05 12:50:20.091127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.971 [2024-12-05 12:50:20.091253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00001aff cdw11:00000000 00:08:16.971 [2024-12-05 12:50:20.091269] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.971 [2024-12-05 12:50:20.091391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:16.971 [2024-12-05 12:50:20.091406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.971 #23 NEW cov: 12324 ft: 14612 corp: 17/77b lim: 10 exec/s: 23 rss: 73Mb L: 7/10 MS: 1 CrossOver- 00:08:16.971 [2024-12-05 12:50:20.161037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:16.971 [2024-12-05 12:50:20.161064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.971 [2024-12-05 12:50:20.161183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:16.971 [2024-12-05 12:50:20.161212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.971 #25 NEW cov: 12324 ft: 14618 corp: 18/82b lim: 10 exec/s: 25 rss: 73Mb L: 5/10 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:16.971 [2024-12-05 12:50:20.211318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:16.971 [2024-12-05 12:50:20.211346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.971 [2024-12-05 12:50:20.211473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000012e cdw11:00000000 00:08:16.971 [2024-12-05 12:50:20.211492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.971 [2024-12-05 12:50:20.211625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000001a cdw11:00000000 00:08:16.971 [2024-12-05 12:50:20.211643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.971 #26 NEW cov: 12324 ft: 14644 corp: 19/88b lim: 10 exec/s: 26 rss: 73Mb L: 6/10 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:16.971 [2024-12-05 12:50:20.281321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000032e cdw11:00000000 00:08:16.971 [2024-12-05 12:50:20.281351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.971 [2024-12-05 12:50:20.281477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000001a cdw11:00000000 00:08:16.971 [2024-12-05 12:50:20.281494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.230 #27 NEW cov: 12324 ft: 14659 corp: 20/92b lim: 10 exec/s: 27 rss: 73Mb L: 4/10 MS: 1 ChangeBit- 00:08:17.230 [2024-12-05 12:50:20.321364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000012e cdw11:00000000 00:08:17.230 [2024-12-05 12:50:20.321392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.230 [2024-12-05 12:50:20.321520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000000a7 cdw11:00000000 00:08:17.230 [2024-12-05 12:50:20.321537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.230 #28 NEW cov: 12324 ft: 14666 corp: 21/96b lim: 10 exec/s: 28 rss: 73Mb L: 4/10 MS: 1 ChangeByte- 00:08:17.230 [2024-12-05 12:50:20.372224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a80 cdw11:00000000 00:08:17.230 [2024-12-05 12:50:20.372251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.230 [2024-12-05 12:50:20.372383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.230 [2024-12-05 12:50:20.372401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.230 [2024-12-05 12:50:20.372522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.230 [2024-12-05 12:50:20.372543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.230 [2024-12-05 12:50:20.372655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff73 cdw11:00000000 00:08:17.230 [2024-12-05 12:50:20.372671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.230 [2024-12-05 12:50:20.372782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.230 [2024-12-05 12:50:20.372800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.230 #29 NEW cov: 12324 ft: 14706 corp: 22/106b lim: 10 exec/s: 29 rss: 73Mb L: 10/10 MS: 1 ChangeByte- 00:08:17.230 [2024-12-05 12:50:20.441720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000011a cdw11:00000000 00:08:17.230 [2024-12-05 12:50:20.441747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.231 #30 NEW cov: 12324 ft: 14726 corp: 23/108b lim: 10 exec/s: 30 rss: 73Mb L: 2/10 MS: 1 EraseBytes- 00:08:17.231 [2024-12-05 12:50:20.492350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:17.231 [2024-12-05 12:50:20.492376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.231 [2024-12-05 12:50:20.492495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:17.231 [2024-12-05 12:50:20.492512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.231 [2024-12-05 12:50:20.492626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:17.231 [2024-12-05 
12:50:20.492644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.231 [2024-12-05 12:50:20.492757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002e00 cdw11:00000000 00:08:17.231 [2024-12-05 12:50:20.492774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.231 #31 NEW cov: 12324 ft: 14735 corp: 24/117b lim: 10 exec/s: 31 rss: 73Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:08:17.231 [2024-12-05 12:50:20.542526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000173 cdw11:00000000 00:08:17.231 [2024-12-05 12:50:20.542558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.231 [2024-12-05 12:50:20.542678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.231 [2024-12-05 12:50:20.542698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.231 [2024-12-05 12:50:20.542812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 00:08:17.231 [2024-12-05 12:50:20.542830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.231 [2024-12-05 12:50:20.542953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002e00 cdw11:00000000 00:08:17.231 [2024-12-05 12:50:20.542969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.490 #32 NEW cov: 12324 ft: 14760 corp: 25/126b lim: 10 exec/s: 32 rss: 73Mb L: 9/10 MS: 1 CrossOver- 00:08:17.490 [2024-12-05 12:50:20.612295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:17.490 [2024-12-05 12:50:20.612321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.490 [2024-12-05 12:50:20.612450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.490 [2024-12-05 12:50:20.612467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.490 #33 NEW cov: 12324 ft: 14763 corp: 26/131b lim: 10 exec/s: 33 rss: 73Mb L: 5/10 MS: 1 CrossOver- 00:08:17.490 [2024-12-05 12:50:20.672948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000003ff cdw11:00000000 00:08:17.490 [2024-12-05 12:50:20.672975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.490 [2024-12-05 12:50:20.673102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.490 [2024-12-05 12:50:20.673116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.490 [2024-12-05 12:50:20.673229] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.490 [2024-12-05 12:50:20.673246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.490 [2024-12-05 12:50:20.673363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000001a cdw11:00000000 00:08:17.490 [2024-12-05 12:50:20.673378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.490 #34 NEW cov: 12324 ft: 14787 corp: 27/139b lim: 10 exec/s: 34 rss: 74Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:08:17.490 [2024-12-05 12:50:20.743170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000001ff cdw11:00000000 00:08:17.490 [2024-12-05 12:50:20.743196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.490 [2024-12-05 12:50:20.743312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.490 [2024-12-05 12:50:20.743328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.490 [2024-12-05 12:50:20.743439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.490 [2024-12-05 12:50:20.743455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.490 [2024-12-05 12:50:20.743571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.490 [2024-12-05 12:50:20.743587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.490 #35 NEW cov: 12324 ft: 14817 corp: 28/148b lim: 10 exec/s: 35 rss: 74Mb L: 9/10 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:17.750 [2024-12-05 12:50:20.813332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000115 cdw11:00000000 00:08:17.750 [2024-12-05 12:50:20.813374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.750 [2024-12-05 12:50:20.813494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.750 [2024-12-05 12:50:20.813512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.750 [2024-12-05 12:50:20.813630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.750 [2024-12-05 12:50:20.813645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.750 [2024-12-05 12:50:20.813764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:17.750 [2024-12-05 12:50:20.813781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.750 #36 
NEW cov: 12324 ft: 14852 corp: 29/157b lim: 10 exec/s: 36 rss: 74Mb L: 9/10 MS: 1 CMP- DE: "\001\025"- 00:08:17.750 [2024-12-05 12:50:20.882958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000012e cdw11:00000000 00:08:17.750 [2024-12-05 12:50:20.882988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.750 #37 NEW cov: 12324 ft: 14855 corp: 30/160b lim: 10 exec/s: 37 rss: 74Mb L: 3/10 MS: 1 ShuffleBytes- 00:08:17.750 [2024-12-05 12:50:20.953197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a1a cdw11:00000000 00:08:17.750 [2024-12-05 12:50:20.953224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.750 #38 NEW cov: 12324 ft: 14886 corp: 31/163b lim: 10 exec/s: 19 rss: 74Mb L: 3/10 MS: 1 InsertByte- 00:08:17.750 #38 DONE cov: 12324 ft: 14886 corp: 31/163b lim: 10 exec/s: 19 rss: 74Mb 00:08:17.750 ###### Recommended dictionary. ###### 00:08:17.750 "\001\000" # Uses: 2 00:08:17.750 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:17.750 "\001\025" # Uses: 0 00:08:17.750 ###### End of recommended dictionary. ###### 00:08:17.750 Done 38 runs in 2 second(s) 00:08:18.010 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:08:18.010 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:18.010 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.010 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:08:18.010 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:08:18.010 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:18.010 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:18.010 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:18.010 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:08:18.010 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:18.010 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:18.011 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:08:18.011 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:08:18.011 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:18.011 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:08:18.011 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:18.011 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:18.011 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:18.011 12:50:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:08:18.011 [2024-12-05 12:50:21.141891] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:18.011 [2024-12-05 12:50:21.141979] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid151757 ] 00:08:18.271 [2024-12-05 12:50:21.344161] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.271 [2024-12-05 12:50:21.358099] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.271 [2024-12-05 12:50:21.410962] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.271 [2024-12-05 12:50:21.427274] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:08:18.271 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.271 INFO: Seed: 16624011 00:08:18.271 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:18.271 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:18.271 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:18.271 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.271 #2 INITED exec/s: 0 rss: 64Mb 00:08:18.271 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:18.271 This may also happen if the target rejected all inputs we tried so far 00:08:18.271 [2024-12-05 12:50:21.493720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d4d4 cdw11:00000000 00:08:18.271 [2024-12-05 12:50:21.493758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.271 [2024-12-05 12:50:21.493886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d4d4 cdw11:00000000 00:08:18.271 [2024-12-05 12:50:21.493904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.271 [2024-12-05 12:50:21.494029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000d40a cdw11:00000000 00:08:18.271 [2024-12-05 12:50:21.494047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.531 NEW_FUNC[1/715]: 0x45dce8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:18.531 NEW_FUNC[2/715]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.531 #3 NEW cov: 12097 ft: 12098 corp: 2/7b lim: 10 exec/s: 0 rss: 71Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:08:18.531 [2024-12-05 12:50:21.824488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:18.531 [2024-12-05 12:50:21.824536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.791 #4 NEW cov: 12210 ft: 12946 corp: 3/9b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 1 CopyPart- 00:08:18.792 [2024-12-05 12:50:21.874584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:18.792 [2024-12-05 12:50:21.874611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.792 #5 NEW cov: 12216 ft: 13271 corp: 4/11b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 1 ShuffleBytes- 00:08:18.792 [2024-12-05 12:50:21.944803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:08:18.792 [2024-12-05 12:50:21.944830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.792 #8 NEW cov: 12301 ft: 13524 corp: 5/13b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 3 ShuffleBytes-ChangeBit-CrossOver- 00:08:18.792 [2024-12-05 12:50:21.994956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a6f cdw11:00000000 00:08:18.792 [2024-12-05 12:50:21.994985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.792 #9 NEW cov: 12301 ft: 13616 corp: 6/15b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 1 InsertByte- 00:08:18.792 [2024-12-05 12:50:22.045295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a67 cdw11:00000000 00:08:18.792 [2024-12-05 12:50:22.045322] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.792 #10 NEW cov: 12301 ft: 13690 corp: 7/17b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 1 ChangeBit- 00:08:19.053 [2024-12-05 12:50:22.115974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d4d4 cdw11:00000000 00:08:19.053 [2024-12-05 12:50:22.116001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.053 [2024-12-05 12:50:22.116135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000dcd4 cdw11:00000000 00:08:19.053 [2024-12-05 12:50:22.116154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.053 [2024-12-05 12:50:22.116300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000d40a cdw11:00000000 00:08:19.053 [2024-12-05 12:50:22.116319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.053 #11 NEW cov: 12301 ft: 13799 corp: 8/23b lim: 10 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 ChangeBit- 00:08:19.053 [2024-12-05 12:50:22.185757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002f0a cdw11:00000000 00:08:19.053 [2024-12-05 12:50:22.185784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.053 #12 NEW cov: 12301 ft: 13836 corp: 9/25b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 1 ChangeByte- 00:08:19.053 [2024-12-05 12:50:22.255974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:19.053 [2024-12-05 12:50:22.256002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.053 #13 NEW cov: 12301 ft: 13861 corp: 10/28b lim: 10 exec/s: 0 rss: 72Mb L: 3/6 MS: 1 CopyPart- 00:08:19.053 [2024-12-05 12:50:22.306638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d4d4 cdw11:00000000 00:08:19.053 [2024-12-05 12:50:22.306665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.053 [2024-12-05 12:50:22.306796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d4d4 cdw11:00000000 00:08:19.053 [2024-12-05 12:50:22.306814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.053 [2024-12-05 12:50:22.306956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000920a cdw11:00000000 00:08:19.053 [2024-12-05 12:50:22.306974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.053 #14 NEW cov: 12301 ft: 13918 corp: 11/34b lim: 10 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 ChangeByte- 00:08:19.053 [2024-12-05 12:50:22.356466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 00:08:19.053 [2024-12-05 12:50:22.356494] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.313 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:19.313 #15 NEW cov: 12324 ft: 13998 corp: 12/36b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 1 CMP- DE: "\377\000"- 00:08:19.313 [2024-12-05 12:50:22.407080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d4d4 cdw11:00000000 00:08:19.313 [2024-12-05 12:50:22.407109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.313 [2024-12-05 12:50:22.407251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000dcd4 cdw11:00000000 00:08:19.313 [2024-12-05 12:50:22.407268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.313 [2024-12-05 12:50:22.407403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000000d4 cdw11:00000000 00:08:19.313 [2024-12-05 12:50:22.407422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.313 #16 NEW cov: 12324 ft: 14100 corp: 13/43b lim: 10 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 InsertByte- 00:08:19.313 [2024-12-05 12:50:22.477259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:08:19.313 [2024-12-05 12:50:22.477289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.313 [2024-12-05 12:50:22.477420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:19.313 [2024-12-05 12:50:22.477439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.313 #17 NEW cov: 12324 ft: 14245 corp: 14/47b lim: 10 exec/s: 17 rss: 72Mb L: 4/7 MS: 1 CrossOver- 00:08:19.313 [2024-12-05 12:50:22.547914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:19.313 [2024-12-05 12:50:22.547943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.313 [2024-12-05 12:50:22.548097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:19.314 [2024-12-05 12:50:22.548116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.314 [2024-12-05 12:50:22.548252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:19.314 [2024-12-05 12:50:22.548271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.314 [2024-12-05 12:50:22.548403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a22 cdw11:00000000 00:08:19.314 [2024-12-05 12:50:22.548422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 
p:0 m:0 dnr:0 00:08:19.314 #20 NEW cov: 12324 ft: 14504 corp: 15/55b lim: 10 exec/s: 20 rss: 72Mb L: 8/8 MS: 3 EraseBytes-ChangeBit-InsertRepeatedBytes- 00:08:19.314 [2024-12-05 12:50:22.597360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a66 cdw11:00000000 00:08:19.314 [2024-12-05 12:50:22.597390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.574 #21 NEW cov: 12324 ft: 14554 corp: 16/57b lim: 10 exec/s: 21 rss: 73Mb L: 2/8 MS: 1 ChangeBit- 00:08:19.574 [2024-12-05 12:50:22.667668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002f0e cdw11:00000000 00:08:19.574 [2024-12-05 12:50:22.667700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.574 #22 NEW cov: 12324 ft: 14626 corp: 17/59b lim: 10 exec/s: 22 rss: 73Mb L: 2/8 MS: 1 ChangeBit- 00:08:19.574 [2024-12-05 12:50:22.737859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000ed4 cdw11:00000000 00:08:19.574 [2024-12-05 12:50:22.737890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.574 #25 NEW cov: 12324 ft: 14632 corp: 18/62b lim: 10 exec/s: 25 rss: 73Mb L: 3/8 MS: 3 ChangeBit-CopyPart-CrossOver- 00:08:19.574 [2024-12-05 12:50:22.788057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:19.574 [2024-12-05 12:50:22.788087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.574 #26 NEW cov: 12324 ft: 14652 corp: 19/64b lim: 10 exec/s: 26 rss: 73Mb L: 2/8 MS: 1 ShuffleBytes- 00:08:19.574 [2024-12-05 12:50:22.838710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d4d4 cdw11:00000000 00:08:19.574 [2024-12-05 12:50:22.838740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.574 [2024-12-05 12:50:22.838880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000dcd4 cdw11:00000000 00:08:19.574 [2024-12-05 12:50:22.838901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.574 [2024-12-05 12:50:22.839032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000000d4 cdw11:00000000 00:08:19.574 [2024-12-05 12:50:22.839048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.574 #27 NEW cov: 12324 ft: 14680 corp: 20/71b lim: 10 exec/s: 27 rss: 73Mb L: 7/8 MS: 1 CopyPart- 00:08:19.834 [2024-12-05 12:50:22.908691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002aff cdw11:00000000 00:08:19.834 [2024-12-05 12:50:22.908722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.834 [2024-12-05 12:50:22.908859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 
00:08:19.834 [2024-12-05 12:50:22.908878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.834 #28 NEW cov: 12324 ft: 14740 corp: 21/75b lim: 10 exec/s: 28 rss: 73Mb L: 4/8 MS: 1 PersAutoDict- DE: "\377\000"- 00:08:19.834 [2024-12-05 12:50:22.978741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a0a cdw11:00000000 00:08:19.834 [2024-12-05 12:50:22.978769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.834 #29 NEW cov: 12324 ft: 14748 corp: 22/77b lim: 10 exec/s: 29 rss: 73Mb L: 2/8 MS: 1 ChangeBit- 00:08:19.834 [2024-12-05 12:50:23.049322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006ed4 cdw11:00000000 00:08:19.834 [2024-12-05 12:50:23.049348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.834 [2024-12-05 12:50:23.049478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d4d4 cdw11:00000000 00:08:19.834 [2024-12-05 12:50:23.049497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.834 [2024-12-05 12:50:23.049634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000d492 cdw11:00000000 00:08:19.834 [2024-12-05 12:50:23.049654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.834 #30 NEW cov: 12324 ft: 14760 corp: 23/84b lim: 10 exec/s: 30 rss: 73Mb L: 7/8 MS: 1 InsertByte- 00:08:19.834 [2024-12-05 12:50:23.119409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002aff cdw11:00000000 00:08:19.834 [2024-12-05 12:50:23.119437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.834 [2024-12-05 12:50:23.119577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:19.834 [2024-12-05 12:50:23.119594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.094 #31 NEW cov: 12324 ft: 14786 corp: 24/89b lim: 10 exec/s: 31 rss: 73Mb L: 5/8 MS: 1 InsertByte- 00:08:20.094 [2024-12-05 12:50:23.189567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:20.094 [2024-12-05 12:50:23.189596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.094 #32 NEW cov: 12324 ft: 14798 corp: 25/91b lim: 10 exec/s: 32 rss: 73Mb L: 2/8 MS: 1 ShuffleBytes- 00:08:20.094 [2024-12-05 12:50:23.239774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a8a cdw11:00000000 00:08:20.094 [2024-12-05 12:50:23.239800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.094 #33 NEW cov: 12324 ft: 14811 corp: 26/93b lim: 10 exec/s: 33 rss: 73Mb L: 2/8 MS: 1 ChangeBit- 00:08:20.094 [2024-12-05 12:50:23.290204] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a0a cdw11:00000000 00:08:20.094 [2024-12-05 12:50:23.290231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.094 [2024-12-05 12:50:23.290373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:20.094 [2024-12-05 12:50:23.290391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.094 #34 NEW cov: 12324 ft: 14819 corp: 27/97b lim: 10 exec/s: 34 rss: 73Mb L: 4/8 MS: 1 ChangeBit- 00:08:20.094 [2024-12-05 12:50:23.340511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000c2f5 cdw11:00000000 00:08:20.094 [2024-12-05 12:50:23.340537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.094 [2024-12-05 12:50:23.340679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f5f5 cdw11:00000000 00:08:20.094 [2024-12-05 12:50:23.340697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.094 #35 NEW cov: 12324 ft: 14827 corp: 28/101b lim: 10 exec/s: 35 rss: 73Mb L: 4/8 MS: 1 ChangeBinInt- 00:08:20.355 [2024-12-05 12:50:23.411265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d4d4 cdw11:00000000 00:08:20.355 [2024-12-05 12:50:23.411293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.355 [2024-12-05 12:50:23.411430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000dc00 cdw11:00000000 00:08:20.355 [2024-12-05 12:50:23.411450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.355 [2024-12-05 12:50:23.411587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000d4d4 cdw11:00000000 00:08:20.355 [2024-12-05 12:50:23.411608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.355 [2024-12-05 12:50:23.411747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000000d4 cdw11:00000000 00:08:20.355 [2024-12-05 12:50:23.411767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.355 #36 NEW cov: 12324 ft: 14838 corp: 29/110b lim: 10 exec/s: 36 rss: 73Mb L: 9/9 MS: 1 CopyPart- 00:08:20.355 [2024-12-05 12:50:23.461239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002aff cdw11:00000000 00:08:20.355 [2024-12-05 12:50:23.461267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.355 [2024-12-05 12:50:23.461405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:08:20.355 [2024-12-05 12:50:23.461424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.355 [2024-12-05 12:50:23.461564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:08:20.355 [2024-12-05 12:50:23.461579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.355 #37 NEW cov: 12324 ft: 14842 corp: 30/116b lim: 10 exec/s: 18 rss: 73Mb L: 6/9 MS: 1 PersAutoDict- DE: "\377\000"- 00:08:20.355 #37 DONE cov: 12324 ft: 14842 corp: 30/116b lim: 10 exec/s: 18 rss: 73Mb 00:08:20.355 ###### Recommended dictionary. ###### 00:08:20.355 "\377\000" # Uses: 2 00:08:20.355 ###### End of recommended dictionary. ###### 00:08:20.355 Done 37 runs in 2 second(s) 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:20.355 12:50:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:08:20.355 [2024-12-05 12:50:23.629343] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
00:08:20.355 [2024-12-05 12:50:23.629430] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid152186 ] 00:08:20.615 [2024-12-05 12:50:23.827490] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.615 [2024-12-05 12:50:23.839788] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.615 [2024-12-05 12:50:23.892385] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.615 [2024-12-05 12:50:23.908697] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:20.615 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.615 INFO: Seed: 2497653270 00:08:20.876 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:20.876 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:20.876 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:20.876 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.876 [2024-12-05 12:50:23.974090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 12:50:23.974119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.876 #2 INITED cov: 12106 ft: 12107 corp: 1/1b exec/s: 0 rss: 71Mb 00:08:20.876 [2024-12-05 12:50:24.014098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 12:50:24.014124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.876 #3 NEW cov: 12236 ft: 12608 corp: 2/2b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ChangeByte- 00:08:20.876 [2024-12-05 12:50:24.074872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 12:50:24.074898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.876 [2024-12-05 12:50:24.074955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 12:50:24.074970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.876 [2024-12-05 12:50:24.075024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 12:50:24.075037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.876 [2024-12-05 12:50:24.075093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 
12:50:24.075106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.876 [2024-12-05 12:50:24.075160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 12:50:24.075173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.876 #4 NEW cov: 12242 ft: 13691 corp: 3/7b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:20.876 [2024-12-05 12:50:24.135022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 12:50:24.135051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.876 [2024-12-05 12:50:24.135107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 12:50:24.135121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.876 [2024-12-05 12:50:24.135176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 12:50:24.135188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.876 [2024-12-05 12:50:24.135242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 12:50:24.135255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.876 [2024-12-05 12:50:24.135308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.876 [2024-12-05 12:50:24.135321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.876 #5 NEW cov: 12327 ft: 13920 corp: 4/12b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeByte- 00:08:21.137 [2024-12-05 12:50:24.195248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.195274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.137 [2024-12-05 12:50:24.195331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.195345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.137 [2024-12-05 12:50:24.195414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.195427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.137 [2024-12-05 12:50:24.195485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.195498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.137 [2024-12-05 12:50:24.195553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.195566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.137 #6 NEW cov: 12327 ft: 13999 corp: 5/17b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:21.137 [2024-12-05 12:50:24.254930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.254956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.137 [2024-12-05 12:50:24.255013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.255030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.137 #7 NEW cov: 12327 ft: 14277 corp: 6/19b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 InsertByte- 00:08:21.137 [2024-12-05 12:50:24.295488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.295514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.137 [2024-12-05 12:50:24.295569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.295583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.137 [2024-12-05 12:50:24.295635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.295649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.137 [2024-12-05 12:50:24.295703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.295716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.137 [2024-12-05 12:50:24.295770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.295783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.137 #8 NEW cov: 12327 ft: 14383 corp: 7/24b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeByte- 00:08:21.137 [2024-12-05 12:50:24.334943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.334969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.137 #9 NEW cov: 12327 ft: 14430 corp: 8/25b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 EraseBytes- 00:08:21.137 [2024-12-05 12:50:24.395150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.395175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.137 #10 NEW cov: 12327 ft: 14462 corp: 9/26b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 CrossOver- 00:08:21.137 [2024-12-05 12:50:24.435555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.435580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.137 [2024-12-05 12:50:24.435653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.435667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.137 [2024-12-05 12:50:24.435721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.137 [2024-12-05 12:50:24.435734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.398 #11 NEW cov: 12327 ft: 14643 corp: 10/29b lim: 5 exec/s: 0 rss: 72Mb L: 3/5 MS: 1 EraseBytes- 00:08:21.398 [2024-12-05 12:50:24.495909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.495933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.398 [2024-12-05 12:50:24.496006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.496020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.398 [2024-12-05 12:50:24.496077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:21.398 [2024-12-05 12:50:24.496091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.398 [2024-12-05 12:50:24.496147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.496160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.398 #12 NEW cov: 12327 ft: 14703 corp: 11/33b lim: 5 exec/s: 0 rss: 72Mb L: 4/5 MS: 1 CopyPart- 00:08:21.398 [2024-12-05 12:50:24.536003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.536029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.398 [2024-12-05 12:50:24.536084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.536098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.398 [2024-12-05 12:50:24.536150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.536164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.398 [2024-12-05 12:50:24.536215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.536228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.398 #13 NEW cov: 12327 ft: 14776 corp: 12/37b lim: 5 exec/s: 0 rss: 72Mb L: 4/5 MS: 1 EraseBytes- 00:08:21.398 [2024-12-05 12:50:24.575626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.575652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.398 #14 NEW cov: 12327 ft: 14830 corp: 13/38b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:21.398 [2024-12-05 12:50:24.636254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.636279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.398 [2024-12-05 12:50:24.636334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.636351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.398 [2024-12-05 
12:50:24.636405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.636419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.398 [2024-12-05 12:50:24.636473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.636486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.398 #15 NEW cov: 12327 ft: 14843 corp: 14/42b lim: 5 exec/s: 0 rss: 72Mb L: 4/5 MS: 1 ChangeBit- 00:08:21.398 [2024-12-05 12:50:24.696277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.696302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.398 [2024-12-05 12:50:24.696356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.696369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.398 [2024-12-05 12:50:24.696424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.398 [2024-12-05 12:50:24.696437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.659 #16 NEW cov: 12327 ft: 14865 corp: 15/45b lim: 5 exec/s: 0 rss: 72Mb L: 3/5 MS: 1 ChangeBinInt- 00:08:21.659 [2024-12-05 12:50:24.756772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.659 [2024-12-05 12:50:24.756796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.659 [2024-12-05 12:50:24.756856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.659 [2024-12-05 12:50:24.756870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.659 [2024-12-05 12:50:24.756942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.659 [2024-12-05 12:50:24.756956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.659 [2024-12-05 12:50:24.757011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.659 [2024-12-05 12:50:24.757024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.659 [2024-12-05 12:50:24.757079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.659 [2024-12-05 12:50:24.757092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.659 #17 NEW cov: 12327 ft: 14905 corp: 16/50b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 ChangeByte- 00:08:21.659 [2024-12-05 12:50:24.796402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.659 [2024-12-05 12:50:24.796430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.659 [2024-12-05 12:50:24.796486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.659 [2024-12-05 12:50:24.796500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.659 #18 NEW cov: 12327 ft: 14931 corp: 17/52b lim: 5 exec/s: 0 rss: 72Mb L: 2/5 MS: 1 InsertByte- 00:08:21.659 [2024-12-05 12:50:24.836548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.659 [2024-12-05 12:50:24.836573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.659 [2024-12-05 12:50:24.836631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.659 [2024-12-05 12:50:24.836645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.919 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:21.919 #19 NEW cov: 12350 ft: 14979 corp: 18/54b lim: 5 exec/s: 19 rss: 73Mb L: 2/5 MS: 1 InsertByte- 00:08:21.919 [2024-12-05 12:50:25.167929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.919 [2024-12-05 12:50:25.167983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.919 [2024-12-05 12:50:25.168048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.919 [2024-12-05 12:50:25.168067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.919 [2024-12-05 12:50:25.168137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.919 [2024-12-05 12:50:25.168155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.919 [2024-12-05 12:50:25.168215] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.919 [2024-12-05 12:50:25.168233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.919 #20 NEW cov: 12350 ft: 15138 corp: 19/58b lim: 5 exec/s: 20 rss: 74Mb L: 4/5 MS: 1 ChangeBit- 00:08:21.919 [2024-12-05 12:50:25.227638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.919 [2024-12-05 12:50:25.227665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.919 [2024-12-05 12:50:25.227721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.919 [2024-12-05 12:50:25.227735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.919 [2024-12-05 12:50:25.227788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.919 [2024-12-05 12:50:25.227801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.178 #21 NEW cov: 12350 ft: 15176 corp: 20/61b lim: 5 exec/s: 21 rss: 74Mb L: 3/5 MS: 1 ShuffleBytes- 00:08:22.178 [2024-12-05 12:50:25.267403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.267428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.178 #22 NEW cov: 12350 ft: 15258 corp: 21/62b lim: 5 exec/s: 22 rss: 74Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:22.178 [2024-12-05 12:50:25.308006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.308031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.308084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.308097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.308152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.308165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.308218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 
12:50:25.308231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.178 #23 NEW cov: 12350 ft: 15271 corp: 22/66b lim: 5 exec/s: 23 rss: 74Mb L: 4/5 MS: 1 ChangeBit- 00:08:22.178 [2024-12-05 12:50:25.368291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.368316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.368368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.368382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.368433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.368446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.368500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.368513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.368566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.368579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.178 #24 NEW cov: 12350 ft: 15306 corp: 23/71b lim: 5 exec/s: 24 rss: 74Mb L: 5/5 MS: 1 ChangeBit- 00:08:22.178 [2024-12-05 12:50:25.408244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.408272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.408327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.408340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.408392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.408405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.408456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.408469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.178 #25 NEW cov: 12350 ft: 15321 corp: 24/75b lim: 5 exec/s: 25 rss: 74Mb L: 4/5 MS: 1 ChangeBit- 00:08:22.178 [2024-12-05 12:50:25.448369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.448394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.448449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.448463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.448514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.448543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.178 [2024-12-05 12:50:25.448597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.178 [2024-12-05 12:50:25.448610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.178 #26 NEW cov: 12350 ft: 15332 corp: 25/79b lim: 5 exec/s: 26 rss: 74Mb L: 4/5 MS: 1 CopyPart- 00:08:22.437 [2024-12-05 12:50:25.508522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.437 [2024-12-05 12:50:25.508548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.437 [2024-12-05 12:50:25.508603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.437 [2024-12-05 12:50:25.508617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.437 [2024-12-05 12:50:25.508670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.437 [2024-12-05 12:50:25.508683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.437 [2024-12-05 12:50:25.508733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.437 [2024-12-05 12:50:25.508749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.437 #27 NEW cov: 12350 ft: 15354 corp: 26/83b lim: 5 exec/s: 27 rss: 74Mb L: 4/5 MS: 1 CrossOver- 
00:08:22.437 [2024-12-05 12:50:25.548354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.437 [2024-12-05 12:50:25.548381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.437 [2024-12-05 12:50:25.548438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.437 [2024-12-05 12:50:25.548451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.437 #28 NEW cov: 12350 ft: 15437 corp: 27/85b lim: 5 exec/s: 28 rss: 74Mb L: 2/5 MS: 1 CrossOver- 00:08:22.437 [2024-12-05 12:50:25.608350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.437 [2024-12-05 12:50:25.608375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.437 #29 NEW cov: 12350 ft: 15510 corp: 28/86b lim: 5 exec/s: 29 rss: 74Mb L: 1/5 MS: 1 ChangeByte- 00:08:22.437 [2024-12-05 12:50:25.648452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.437 [2024-12-05 12:50:25.648476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.437 #30 NEW cov: 12350 ft: 15511 corp: 29/87b lim: 5 exec/s: 30 rss: 74Mb L: 1/5 MS: 1 EraseBytes- 00:08:22.437 [2024-12-05 12:50:25.708825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.437 [2024-12-05 12:50:25.708856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.437 [2024-12-05 12:50:25.708911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.437 [2024-12-05 12:50:25.708924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.437 #31 NEW cov: 12350 ft: 15523 corp: 30/89b lim: 5 exec/s: 31 rss: 74Mb L: 2/5 MS: 1 EraseBytes- 00:08:22.697 [2024-12-05 12:50:25.769268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.769292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.697 [2024-12-05 12:50:25.769346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.769360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.697 [2024-12-05 12:50:25.769412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.769425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.697 [2024-12-05 12:50:25.769479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.769492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.697 #32 NEW cov: 12350 ft: 15538 corp: 31/93b lim: 5 exec/s: 32 rss: 74Mb L: 4/5 MS: 1 CMP- DE: "\001\000"- 00:08:22.697 [2024-12-05 12:50:25.809049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.809075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.697 [2024-12-05 12:50:25.809128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.809141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.697 #33 NEW cov: 12350 ft: 15547 corp: 32/95b lim: 5 exec/s: 33 rss: 74Mb L: 2/5 MS: 1 EraseBytes- 00:08:22.697 [2024-12-05 12:50:25.869353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.869378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.697 [2024-12-05 12:50:25.869431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.869444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.697 [2024-12-05 12:50:25.869493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.869507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.697 #34 NEW cov: 12350 ft: 15563 corp: 33/98b lim: 5 exec/s: 34 rss: 74Mb L: 3/5 MS: 1 ShuffleBytes- 00:08:22.697 [2024-12-05 12:50:25.929372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.929396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.697 [2024-12-05 12:50:25.929448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.929461] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.697 #35 NEW cov: 12350 ft: 15567 corp: 34/100b lim: 5 exec/s: 35 rss: 74Mb L: 2/5 MS: 1 EraseBytes- 00:08:22.697 [2024-12-05 12:50:25.969943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.969969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.697 [2024-12-05 12:50:25.970023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.970036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.697 [2024-12-05 12:50:25.970088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.697 [2024-12-05 12:50:25.970117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.698 [2024-12-05 12:50:25.970170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.698 [2024-12-05 12:50:25.970186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.698 [2024-12-05 12:50:25.970238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.698 [2024-12-05 12:50:25.970251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:22.698 #36 NEW cov: 12350 ft: 15578 corp: 35/105b lim: 5 exec/s: 18 rss: 74Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:22.698 #36 DONE cov: 12350 ft: 15578 corp: 35/105b lim: 5 exec/s: 18 rss: 74Mb 00:08:22.698 ###### Recommended dictionary. ###### 00:08:22.698 "\001\000" # Uses: 0 00:08:22.698 ###### End of recommended dictionary. 
###### 00:08:22.698 Done 36 runs in 2 second(s) 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:22.957 12:50:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:08:22.957 [2024-12-05 12:50:26.138576] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:22.957 [2024-12-05 12:50:26.138645] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid152722 ] 00:08:23.217 [2024-12-05 12:50:26.345806] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.217 [2024-12-05 12:50:26.358371] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.217 [2024-12-05 12:50:26.410685] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.217 [2024-12-05 12:50:26.427017] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:23.217 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:23.217 INFO: Seed: 718650112 00:08:23.217 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:23.217 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:23.217 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:23.217 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.217 [2024-12-05 12:50:26.485401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.217 [2024-12-05 12:50:26.485428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.217 #2 INITED cov: 12124 ft: 12117 corp: 1/1b exec/s: 0 rss: 70Mb 00:08:23.217 [2024-12-05 12:50:26.526108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.217 [2024-12-05 12:50:26.526133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.217 [2024-12-05 12:50:26.526190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.217 [2024-12-05 12:50:26.526205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.217 [2024-12-05 12:50:26.526258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.217 [2024-12-05 12:50:26.526271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.217 [2024-12-05 12:50:26.526326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.217 [2024-12-05 12:50:26.526339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.217 [2024-12-05 12:50:26.526393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.217 [2024-12-05 12:50:26.526406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:23.477 #3 NEW cov: 12237 ft: 13462 corp: 2/6b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:23.477 [2024-12-05 12:50:26.586246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.477 [2024-12-05 12:50:26.586270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.477 [2024-12-05 12:50:26.586327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.477 [2024-12-05 12:50:26.586341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.477 [2024-12-05 12:50:26.586396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.477 [2024-12-05 12:50:26.586409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.477 [2024-12-05 12:50:26.586462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.477 [2024-12-05 12:50:26.586475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.477 [2024-12-05 12:50:26.586527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.477 [2024-12-05 12:50:26.586544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:23.477 #4 NEW cov: 12243 ft: 13691 corp: 3/11b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeByte- 00:08:23.477 [2024-12-05 12:50:26.646353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.477 [2024-12-05 12:50:26.646378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.646434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.646447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.646501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.646531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.646585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.646598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.646651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.646664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:23.478 #5 NEW cov: 12328 ft: 13965 corp: 4/16b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeBit- 00:08:23.478 [2024-12-05 12:50:26.706538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.706563] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.706616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.706630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.706682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.706694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.706746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.706758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.706811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.706823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:23.478 #6 NEW cov: 12328 ft: 14088 corp: 5/21b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:23.478 [2024-12-05 12:50:26.746678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.746703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.746759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.746772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.746826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.746843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.746898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.746910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.478 [2024-12-05 12:50:26.746963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.478 [2024-12-05 12:50:26.746976] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:23.478 #7 NEW cov: 12328 ft: 14160 corp: 6/26b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CopyPart- 00:08:23.738 [2024-12-05 12:50:26.806544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.738 [2024-12-05 12:50:26.806570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.738 [2024-12-05 12:50:26.806622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.738 [2024-12-05 12:50:26.806636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.738 [2024-12-05 12:50:26.806689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.738 [2024-12-05 12:50:26.806702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.738 #8 NEW cov: 12328 ft: 14412 corp: 7/29b lim: 5 exec/s: 0 rss: 71Mb L: 3/5 MS: 1 EraseBytes- 00:08:23.739 [2024-12-05 12:50:26.846626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:26.846650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:26.846706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:26.846720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:26.846774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:26.846787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.739 #9 NEW cov: 12328 ft: 14486 corp: 8/32b lim: 5 exec/s: 0 rss: 71Mb L: 3/5 MS: 1 CrossOver- 00:08:23.739 [2024-12-05 12:50:26.906756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:26.906780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:26.906841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:26.906855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:26.906926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 
nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:26.906940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.739 #10 NEW cov: 12328 ft: 14573 corp: 9/35b lim: 5 exec/s: 0 rss: 71Mb L: 3/5 MS: 1 ChangeBinInt- 00:08:23.739 [2024-12-05 12:50:26.967264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:26.967288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:26.967343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:26.967356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:26.967410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:26.967423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:26.967477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:26.967489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:26.967544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:26.967557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:23.739 #11 NEW cov: 12328 ft: 14613 corp: 10/40b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:23.739 [2024-12-05 12:50:27.007354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:27.007379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:27.007433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:27.007446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:27.007497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:27.007509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:27.007564] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:27.007577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:27.007630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:27.007642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:23.739 #12 NEW cov: 12328 ft: 14697 corp: 11/45b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CrossOver- 00:08:23.739 [2024-12-05 12:50:27.047181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:27.047205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:27.047274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:27.047288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.739 [2024-12-05 12:50:27.047342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.739 [2024-12-05 12:50:27.047355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.000 #13 NEW cov: 12328 ft: 14741 corp: 12/48b lim: 5 exec/s: 0 rss: 71Mb L: 3/5 MS: 1 EraseBytes- 00:08:24.000 [2024-12-05 12:50:27.087445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.000 [2024-12-05 12:50:27.087470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.000 [2024-12-05 12:50:27.087529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.000 [2024-12-05 12:50:27.087543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.000 [2024-12-05 12:50:27.087599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.000 [2024-12-05 12:50:27.087612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.000 [2024-12-05 12:50:27.087666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.000 [2024-12-05 12:50:27.087680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 
dnr:0 00:08:24.000 #14 NEW cov: 12328 ft: 14760 corp: 13/52b lim: 5 exec/s: 0 rss: 71Mb L: 4/5 MS: 1 EraseBytes- 00:08:24.000 [2024-12-05 12:50:27.147764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.000 [2024-12-05 12:50:27.147791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.000 [2024-12-05 12:50:27.147851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.000 [2024-12-05 12:50:27.147872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.000 [2024-12-05 12:50:27.147933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.000 [2024-12-05 12:50:27.147947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.000 [2024-12-05 12:50:27.148003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.000 [2024-12-05 12:50:27.148016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.000 [2024-12-05 12:50:27.148069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.000 [2024-12-05 12:50:27.148082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.000 #15 NEW cov: 12328 ft: 14772 corp: 14/57b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeByte- 00:08:24.000 [2024-12-05 12:50:27.187856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.000 [2024-12-05 12:50:27.187881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.000 [2024-12-05 12:50:27.187937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 12:50:27.187950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.001 [2024-12-05 12:50:27.188021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 12:50:27.188035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.001 [2024-12-05 12:50:27.188089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 12:50:27.188102] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.001 [2024-12-05 12:50:27.188157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 12:50:27.188170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.001 #16 NEW cov: 12328 ft: 14867 corp: 15/62b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CrossOver- 00:08:24.001 [2024-12-05 12:50:27.248021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 12:50:27.248046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.001 [2024-12-05 12:50:27.248117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 12:50:27.248131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.001 [2024-12-05 12:50:27.248184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 12:50:27.248198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.001 [2024-12-05 12:50:27.248252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 12:50:27.248266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.001 [2024-12-05 12:50:27.248318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 12:50:27.248332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.001 #17 NEW cov: 12328 ft: 14925 corp: 16/67b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CopyPart- 00:08:24.001 [2024-12-05 12:50:27.287826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 12:50:27.287855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.001 [2024-12-05 12:50:27.287910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 12:50:27.287923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.001 [2024-12-05 12:50:27.287977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.001 [2024-12-05 
12:50:27.287990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.001 #18 NEW cov: 12328 ft: 14947 corp: 17/70b lim: 5 exec/s: 0 rss: 71Mb L: 3/5 MS: 1 ChangeBit- 00:08:24.261 [2024-12-05 12:50:27.327920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.262 [2024-12-05 12:50:27.327945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.262 [2024-12-05 12:50:27.327999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.262 [2024-12-05 12:50:27.328013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.262 [2024-12-05 12:50:27.328069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.262 [2024-12-05 12:50:27.328082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.522 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:24.522 #19 NEW cov: 12351 ft: 14984 corp: 18/73b lim: 5 exec/s: 19 rss: 73Mb L: 3/5 MS: 1 EraseBytes- 00:08:24.522 [2024-12-05 12:50:27.659354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.522 [2024-12-05 12:50:27.659406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.522 [2024-12-05 12:50:27.659488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.522 [2024-12-05 12:50:27.659514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.522 [2024-12-05 12:50:27.659591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.522 [2024-12-05 12:50:27.659621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.522 [2024-12-05 12:50:27.659697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.522 [2024-12-05 12:50:27.659722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.522 [2024-12-05 12:50:27.659800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.522 [2024-12-05 12:50:27.659824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.522 #20 NEW cov: 12351 ft: 15212 corp: 
19/78b lim: 5 exec/s: 20 rss: 73Mb L: 5/5 MS: 1 ChangeByte- 00:08:24.522 [2024-12-05 12:50:27.708858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.522 [2024-12-05 12:50:27.708882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.522 [2024-12-05 12:50:27.708938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.522 [2024-12-05 12:50:27.708951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.522 [2024-12-05 12:50:27.709004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.522 [2024-12-05 12:50:27.709017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.522 #21 NEW cov: 12351 ft: 15227 corp: 20/81b lim: 5 exec/s: 21 rss: 73Mb L: 3/5 MS: 1 ShuffleBytes- 00:08:24.522 [2024-12-05 12:50:27.749233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.522 [2024-12-05 12:50:27.749258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.522 [2024-12-05 12:50:27.749314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.523 [2024-12-05 12:50:27.749327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.523 [2024-12-05 12:50:27.749378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.523 [2024-12-05 12:50:27.749391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.523 [2024-12-05 12:50:27.749443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.523 [2024-12-05 12:50:27.749456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.523 [2024-12-05 12:50:27.749508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.523 [2024-12-05 12:50:27.749520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.523 #22 NEW cov: 12351 ft: 15243 corp: 21/86b lim: 5 exec/s: 22 rss: 73Mb L: 5/5 MS: 1 CrossOver- 00:08:24.523 [2024-12-05 12:50:27.788932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.523 [2024-12-05 
12:50:27.788957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.523 [2024-12-05 12:50:27.789010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.523 [2024-12-05 12:50:27.789024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.523 #23 NEW cov: 12351 ft: 15412 corp: 22/88b lim: 5 exec/s: 23 rss: 73Mb L: 2/5 MS: 1 InsertByte- 00:08:24.523 [2024-12-05 12:50:27.829456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.523 [2024-12-05 12:50:27.829480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.523 [2024-12-05 12:50:27.829535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.523 [2024-12-05 12:50:27.829548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.523 [2024-12-05 12:50:27.829599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.523 [2024-12-05 12:50:27.829613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.523 [2024-12-05 12:50:27.829662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.523 [2024-12-05 12:50:27.829675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.523 [2024-12-05 12:50:27.829729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.523 [2024-12-05 12:50:27.829741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.783 #24 NEW cov: 12351 ft: 15441 corp: 23/93b lim: 5 exec/s: 24 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:24.783 [2024-12-05 12:50:27.889680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.783 [2024-12-05 12:50:27.889704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.783 [2024-12-05 12:50:27.889758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.783 [2024-12-05 12:50:27.889771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.783 [2024-12-05 12:50:27.889824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.783 [2024-12-05 12:50:27.889841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.783 [2024-12-05 12:50:27.889893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.783 [2024-12-05 12:50:27.889906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.783 [2024-12-05 12:50:27.889957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.783 [2024-12-05 12:50:27.889972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.783 #25 NEW cov: 12351 ft: 15459 corp: 24/98b lim: 5 exec/s: 25 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:24.784 [2024-12-05 12:50:27.929443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:27.929467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:27.929521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:27.929535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:27.929586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:27.929615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.784 #26 NEW cov: 12351 ft: 15465 corp: 25/101b lim: 5 exec/s: 26 rss: 73Mb L: 3/5 MS: 1 ChangeBit- 00:08:24.784 [2024-12-05 12:50:27.989949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:27.989973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:27.990044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:27.990058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:27.990112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:27.990125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:27.990178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:27.990191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:27.990245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:27.990258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.784 #27 NEW cov: 12351 ft: 15505 corp: 26/106b lim: 5 exec/s: 27 rss: 73Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:24.784 [2024-12-05 12:50:28.030051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:28.030075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:28.030130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:28.030144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:28.030201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:28.030214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:28.030266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:28.030279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:28.030333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:28.030345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.784 #28 NEW cov: 12351 ft: 15515 corp: 27/111b lim: 5 exec/s: 28 rss: 73Mb L: 5/5 MS: 1 ChangeByte- 00:08:24.784 [2024-12-05 12:50:28.090255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:28.090279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:28.090332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:28.090345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:28.090397] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:28.090410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:28.090462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:28.090474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.784 [2024-12-05 12:50:28.090528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.784 [2024-12-05 12:50:28.090540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.045 #29 NEW cov: 12351 ft: 15524 corp: 28/116b lim: 5 exec/s: 29 rss: 73Mb L: 5/5 MS: 1 ChangeBit- 00:08:25.045 [2024-12-05 12:50:28.130298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.130322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.130375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.130389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.130438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.130451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.130506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.130518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.130568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.130580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.045 #30 NEW cov: 12351 ft: 15539 corp: 29/121b lim: 5 exec/s: 30 rss: 73Mb L: 5/5 MS: 1 ChangeBit- 00:08:25.045 [2024-12-05 12:50:28.170462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.170485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.170538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.170551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.170605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.170617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.170668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.170681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.170734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.170747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.045 #31 NEW cov: 12351 ft: 15553 corp: 30/126b lim: 5 exec/s: 31 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:25.045 [2024-12-05 12:50:28.210141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.210165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.210219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.210232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.045 #32 NEW cov: 12351 ft: 15563 corp: 31/128b lim: 5 exec/s: 32 rss: 73Mb L: 2/5 MS: 1 CrossOver- 00:08:25.045 [2024-12-05 12:50:28.270753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.270777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.270836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.270849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.270905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.270917] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.270969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.270982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.271035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.271047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.045 #33 NEW cov: 12351 ft: 15568 corp: 32/133b lim: 5 exec/s: 33 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:25.045 [2024-12-05 12:50:28.310815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.310844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.310895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.310908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.310959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.310972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.311022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.311035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.311086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.311099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.045 #34 NEW cov: 12351 ft: 15574 corp: 33/138b lim: 5 exec/s: 34 rss: 73Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:25.045 [2024-12-05 12:50:28.350649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.350673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.350725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:25.045 [2024-12-05 12:50:28.350738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.045 [2024-12-05 12:50:28.350790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.045 [2024-12-05 12:50:28.350802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.306 #35 NEW cov: 12351 ft: 15615 corp: 34/141b lim: 5 exec/s: 35 rss: 73Mb L: 3/5 MS: 1 EraseBytes- 00:08:25.306 [2024-12-05 12:50:28.411128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.306 [2024-12-05 12:50:28.411152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.306 [2024-12-05 12:50:28.411205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.306 [2024-12-05 12:50:28.411218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.306 [2024-12-05 12:50:28.411269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.306 [2024-12-05 12:50:28.411282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.306 [2024-12-05 12:50:28.411335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.306 [2024-12-05 12:50:28.411347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.306 [2024-12-05 12:50:28.411399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.306 [2024-12-05 12:50:28.411411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.306 #36 NEW cov: 12351 ft: 15631 corp: 35/146b lim: 5 exec/s: 36 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:25.306 [2024-12-05 12:50:28.470855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.306 [2024-12-05 12:50:28.470879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.306 [2024-12-05 12:50:28.470932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.306 [2024-12-05 12:50:28.470945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.306 #37 NEW cov: 12351 ft: 15647 corp: 36/148b lim: 5 exec/s: 18 rss: 74Mb L: 2/5 MS: 1 ChangeBinInt- 00:08:25.306 #37 DONE cov: 
12351 ft: 15647 corp: 36/148b lim: 5 exec/s: 18 rss: 74Mb 00:08:25.306 Done 37 runs in 2 second(s) 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:08:25.306 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:25.567 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:25.567 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.567 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:25.567 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:25.567 12:50:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:08:25.567 [2024-12-05 12:50:28.656837] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
00:08:25.567 [2024-12-05 12:50:28.656910] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid153011 ] 00:08:25.567 [2024-12-05 12:50:28.865137] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.567 [2024-12-05 12:50:28.877980] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.827 [2024-12-05 12:50:28.930425] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:25.827 [2024-12-05 12:50:28.946747] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:25.828 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.828 INFO: Seed: 3240653357 00:08:25.828 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:25.828 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:25.828 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:25.828 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.828 #2 INITED exec/s: 0 rss: 64Mb 00:08:25.828 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:25.828 This may also happen if the target rejected all inputs we tried so far 00:08:25.828 [2024-12-05 12:50:29.023873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.828 [2024-12-05 12:50:29.023910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.828 [2024-12-05 12:50:29.024067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.828 [2024-12-05 12:50:29.024086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.828 [2024-12-05 12:50:29.024233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.828 [2024-12-05 12:50:29.024252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.828 [2024-12-05 12:50:29.024407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.828 [2024-12-05 12:50:29.024427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.114 NEW_FUNC[1/716]: 0x45f668 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:26.114 NEW_FUNC[2/716]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.114 #8 NEW cov: 12146 ft: 12148 corp: 2/35b lim: 40 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:26.114 [2024-12-05 12:50:29.354589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 
nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.114 [2024-12-05 12:50:29.354636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.114 [2024-12-05 12:50:29.354784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.114 [2024-12-05 12:50:29.354807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.114 [2024-12-05 12:50:29.354968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.114 [2024-12-05 12:50:29.354991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.114 [2024-12-05 12:50:29.355136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:d5000000 cdw11:00d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.114 [2024-12-05 12:50:29.355157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.114 #9 NEW cov: 12260 ft: 12879 corp: 3/73b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:26.114 [2024-12-05 12:50:29.423967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac20000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.114 [2024-12-05 12:50:29.424001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.374 #13 NEW cov: 12266 ft: 13784 corp: 4/85b lim: 40 exec/s: 0 rss: 72Mb L: 12/38 MS: 4 CopyPart-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:08:26.374 [2024-12-05 12:50:29.474076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.374 [2024-12-05 12:50:29.474106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.374 #18 NEW cov: 12351 ft: 13976 corp: 5/100b lim: 40 exec/s: 0 rss: 72Mb L: 15/38 MS: 5 CopyPart-ShuffleBytes-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:26.374 [2024-12-05 12:50:29.524428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.374 [2024-12-05 12:50:29.524455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.374 [2024-12-05 12:50:29.524592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d5d5d5d5 cdw11:0000d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.374 [2024-12-05 12:50:29.524609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.375 #19 NEW cov: 12351 ft: 14285 corp: 6/123b lim: 40 exec/s: 0 rss: 72Mb L: 23/38 MS: 1 EraseBytes- 00:08:26.375 [2024-12-05 12:50:29.594477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 
nsid:0 cdw10:0ac20000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.375 [2024-12-05 12:50:29.594504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.375 #20 NEW cov: 12351 ft: 14407 corp: 7/135b lim: 40 exec/s: 0 rss: 72Mb L: 12/38 MS: 1 CMP- DE: "\012\000"- 00:08:26.375 [2024-12-05 12:50:29.664665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.375 [2024-12-05 12:50:29.664693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.635 #21 NEW cov: 12351 ft: 14509 corp: 8/150b lim: 40 exec/s: 0 rss: 72Mb L: 15/38 MS: 1 CopyPart- 00:08:26.635 [2024-12-05 12:50:29.725497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.725526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.635 [2024-12-05 12:50:29.725665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.725684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.635 [2024-12-05 12:50:29.725819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.725838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.635 [2024-12-05 12:50:29.725965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.725981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.635 #22 NEW cov: 12351 ft: 14575 corp: 9/185b lim: 40 exec/s: 0 rss: 72Mb L: 35/38 MS: 1 CopyPart- 00:08:26.635 [2024-12-05 12:50:29.775357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.775384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.635 [2024-12-05 12:50:29.775513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffbfbfbf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.775529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.635 [2024-12-05 12:50:29.775660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:bfbfbfbf cdw11:bfbfbfbf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.775676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.635 #23 NEW cov: 12351 ft: 14771 corp: 10/215b lim: 40 exec/s: 0 rss: 72Mb L: 30/38 MS: 1 InsertRepeatedBytes- 00:08:26.635 [2024-12-05 12:50:29.815314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.815340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.635 [2024-12-05 12:50:29.815464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.815479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.635 #24 NEW cov: 12351 ft: 14870 corp: 11/234b lim: 40 exec/s: 0 rss: 72Mb L: 19/38 MS: 1 EraseBytes- 00:08:26.635 [2024-12-05 12:50:29.885715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.885741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.635 [2024-12-05 12:50:29.885891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d5d5d5d5 cdw11:d5d50000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.885908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.635 [2024-12-05 12:50:29.886038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0000d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.886054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.635 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:26.635 #25 NEW cov: 12374 ft: 14909 corp: 12/261b lim: 40 exec/s: 0 rss: 73Mb L: 27/38 MS: 1 EraseBytes- 00:08:26.635 [2024-12-05 12:50:29.935675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.935701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.635 [2024-12-05 12:50:29.935840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d50000d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.635 [2024-12-05 12:50:29.935856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.895 #26 NEW cov: 12374 ft: 14931 corp: 13/281b lim: 40 exec/s: 0 rss: 73Mb L: 20/38 MS: 1 EraseBytes- 00:08:26.895 [2024-12-05 12:50:29.986028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.895 [2024-12-05 12:50:29.986054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.895 [2024-12-05 12:50:29.986186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffbfbfbf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.895 [2024-12-05 12:50:29.986202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.895 [2024-12-05 12:50:29.986330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:bfbfbfbf cdw11:bfbfbfbf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.895 [2024-12-05 12:50:29.986345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.895 #27 NEW cov: 12374 ft: 14949 corp: 14/311b lim: 40 exec/s: 27 rss: 73Mb L: 30/38 MS: 1 ShuffleBytes- 00:08:26.895 [2024-12-05 12:50:30.056448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.895 [2024-12-05 12:50:30.056476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.895 [2024-12-05 12:50:30.056607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2e2a2a2a cdw11:d5d50000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.895 [2024-12-05 12:50:30.056625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.895 [2024-12-05 12:50:30.056755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0000d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.895 [2024-12-05 12:50:30.056772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.895 #28 NEW cov: 12374 ft: 14997 corp: 15/338b lim: 40 exec/s: 28 rss: 73Mb L: 27/38 MS: 1 ChangeBinInt- 00:08:26.896 [2024-12-05 12:50:30.126069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac20000 cdw11:00002000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.896 [2024-12-05 12:50:30.126097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.896 #29 NEW cov: 12374 ft: 15033 corp: 16/350b lim: 40 exec/s: 29 rss: 73Mb L: 12/38 MS: 1 ChangeBit- 00:08:26.896 [2024-12-05 12:50:30.196279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac20000 cdw11:00002000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.896 [2024-12-05 12:50:30.196306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.156 #30 NEW cov: 12374 ft: 15077 corp: 17/362b lim: 40 exec/s: 30 rss: 73Mb L: 12/38 MS: 1 ChangeByte- 00:08:27.157 [2024-12-05 12:50:30.266919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.157 [2024-12-05 12:50:30.266948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.157 [2024-12-05 
12:50:30.267075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d5d5d5d5 cdw11:d5d50000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.157 [2024-12-05 12:50:30.267092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.157 [2024-12-05 12:50:30.267220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0000d5d5 cdw11:1b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.157 [2024-12-05 12:50:30.267239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.157 #31 NEW cov: 12374 ft: 15116 corp: 18/389b lim: 40 exec/s: 31 rss: 73Mb L: 27/38 MS: 1 ChangeBinInt- 00:08:27.157 [2024-12-05 12:50:30.316596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac20000 cdw11:23000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.157 [2024-12-05 12:50:30.316625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.157 #32 NEW cov: 12374 ft: 15137 corp: 19/402b lim: 40 exec/s: 32 rss: 73Mb L: 13/38 MS: 1 InsertByte- 00:08:27.157 [2024-12-05 12:50:30.387066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.157 [2024-12-05 12:50:30.387095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.157 [2024-12-05 12:50:30.387234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.157 [2024-12-05 12:50:30.387252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.157 #33 NEW cov: 12374 ft: 15151 corp: 20/423b lim: 40 exec/s: 33 rss: 73Mb L: 21/38 MS: 1 EraseBytes- 00:08:27.157 [2024-12-05 12:50:30.437007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac20000 cdw11:0a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.157 [2024-12-05 12:50:30.437035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.157 #34 NEW cov: 12374 ft: 15161 corp: 21/437b lim: 40 exec/s: 34 rss: 73Mb L: 14/38 MS: 1 PersAutoDict- DE: "\012\000"- 00:08:27.416 [2024-12-05 12:50:30.487798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.417 [2024-12-05 12:50:30.487825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.417 [2024-12-05 12:50:30.487956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.417 [2024-12-05 12:50:30.487972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.417 [2024-12-05 12:50:30.488096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5c5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.417 [2024-12-05 12:50:30.488113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.417 [2024-12-05 12:50:30.488248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.417 [2024-12-05 12:50:30.488265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.417 #35 NEW cov: 12374 ft: 15169 corp: 22/472b lim: 40 exec/s: 35 rss: 73Mb L: 35/38 MS: 1 ChangeBit- 00:08:27.417 [2024-12-05 12:50:30.537273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac20000 cdw11:2000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.417 [2024-12-05 12:50:30.537302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.417 #36 NEW cov: 12374 ft: 15185 corp: 23/484b lim: 40 exec/s: 36 rss: 73Mb L: 12/38 MS: 1 CopyPart- 00:08:27.417 [2024-12-05 12:50:30.587998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.417 [2024-12-05 12:50:30.588027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.417 [2024-12-05 12:50:30.588157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffbfbfbf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.417 [2024-12-05 12:50:30.588176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.417 [2024-12-05 12:50:30.588306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:bfbfbf28 cdw11:bfbfbfbf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.417 [2024-12-05 12:50:30.588324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.417 #37 NEW cov: 12374 ft: 15200 corp: 24/515b lim: 40 exec/s: 37 rss: 73Mb L: 31/38 MS: 1 InsertByte- 00:08:27.417 [2024-12-05 12:50:30.657717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:0fffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.417 [2024-12-05 12:50:30.657745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.417 #38 NEW cov: 12374 ft: 15222 corp: 25/530b lim: 40 exec/s: 38 rss: 73Mb L: 15/38 MS: 1 ChangeBinInt- 00:08:27.417 [2024-12-05 12:50:30.707823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:0fffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.417 [2024-12-05 12:50:30.707853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.677 #39 NEW cov: 12374 ft: 15249 corp: 26/545b lim: 40 exec/s: 39 rss: 73Mb L: 15/38 MS: 1 ChangeByte- 00:08:27.677 [2024-12-05 12:50:30.778003] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.677 [2024-12-05 12:50:30.778031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.677 #40 NEW cov: 12374 ft: 15266 corp: 27/554b lim: 40 exec/s: 40 rss: 73Mb L: 9/38 MS: 1 EraseBytes- 00:08:27.677 [2024-12-05 12:50:30.828865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.677 [2024-12-05 12:50:30.828889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.677 [2024-12-05 12:50:30.829024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.677 [2024-12-05 12:50:30.829041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.677 [2024-12-05 12:50:30.829171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d586 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.677 [2024-12-05 12:50:30.829188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.677 [2024-12-05 12:50:30.829315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.677 [2024-12-05 12:50:30.829332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.677 #41 NEW cov: 12374 ft: 15281 corp: 28/588b lim: 40 exec/s: 41 rss: 73Mb L: 34/38 MS: 1 ChangeByte- 00:08:27.677 [2024-12-05 12:50:30.878289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac20000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.677 [2024-12-05 12:50:30.878315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.677 #42 NEW cov: 12374 ft: 15283 corp: 29/600b lim: 40 exec/s: 42 rss: 73Mb L: 12/38 MS: 1 ShuffleBytes- 00:08:27.677 [2024-12-05 12:50:30.918391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac20000 cdw11:23000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.677 [2024-12-05 12:50:30.918416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.677 #43 NEW cov: 12374 ft: 15289 corp: 30/613b lim: 40 exec/s: 43 rss: 73Mb L: 13/38 MS: 1 ChangeBinInt- 00:08:27.677 [2024-12-05 12:50:30.989321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d50d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.677 [2024-12-05 12:50:30.989347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.677 [2024-12-05 12:50:30.989476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 
cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:27.677 [2024-12-05 12:50:30.989494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:27.677 [2024-12-05 12:50:30.989623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:27.677 [2024-12-05 12:50:30.989641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:27.677 [2024-12-05 12:50:30.989776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:d5d5d5d5 cdw11:d5d5d5d5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:27.677 [2024-12-05 12:50:30.989793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:27.937 #44 NEW cov: 12374 ft: 15308 corp: 31/648b lim: 40 exec/s: 22 rss: 73Mb L: 35/38 MS: 1 InsertByte-
00:08:27.937 #44 DONE cov: 12374 ft: 15308 corp: 31/648b lim: 40 exec/s: 22 rss: 73Mb
00:08:27.937 ###### Recommended dictionary. ######
00:08:27.937 "\012\000" # Uses: 1
00:08:27.937 ###### End of recommended dictionary. ######
00:08:27.937 Done 44 runs in 2 second(s)
00:08:27.937 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz
00:08:27.937 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:27.937 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:27.937 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1
00:08:27.937 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11
00:08:27.937 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
00:08:27.937 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
00:08:27.937 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:08:27.937 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf
00:08:27.937 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:27.938 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:27.938 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11
00:08:27.938 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411
00:08:27.938 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:08:27.938 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411'
00:08:27.938 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:27.938 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:27.938 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:27.938 12:50:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11
00:08:27.938 [2024-12-05 12:50:31.152599] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization...
00:08:28.198 [2024-12-05 12:50:31.152682] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid153541 ]
00:08:28.198 [2024-12-05 12:50:31.351062] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:28.198 [2024-12-05 12:50:31.363589] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:28.198 [2024-12-05 12:50:31.415907] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:28.198 [2024-12-05 12:50:31.432172] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 ***
00:08:28.198 INFO: Running with entropic power schedule (0xFF, 100).
00:08:28.198 INFO: Seed: 1430681786
00:08:28.198 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8),
00:08:28.198 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868),
00:08:28.198 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:08:28.198 INFO: A corpus is not provided, starting from an empty corpus
00:08:28.198 #2 INITED exec/s: 0 rss: 64Mb
00:08:28.198 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:28.198 This may also happen if the target rejected all inputs we tried so far 00:08:28.198 [2024-12-05 12:50:31.498110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.198 [2024-12-05 12:50:31.498138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.198 [2024-12-05 12:50:31.498198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.198 [2024-12-05 12:50:31.498212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.198 [2024-12-05 12:50:31.498265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.198 [2024-12-05 12:50:31.498279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.198 [2024-12-05 12:50:31.498334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.198 [2024-12-05 12:50:31.498346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.719 NEW_FUNC[1/717]: 0x4613d8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:28.719 NEW_FUNC[2/717]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:28.719 #4 NEW cov: 12159 ft: 12152 corp: 2/35b lim: 40 exec/s: 0 rss: 72Mb L: 34/34 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:28.719 [2024-12-05 12:50:31.829383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.719 [2024-12-05 12:50:31.829459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.719 [2024-12-05 12:50:31.829570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.719 [2024-12-05 12:50:31.829608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.719 [2024-12-05 12:50:31.829716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.719 [2024-12-05 12:50:31.829751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.719 [2024-12-05 12:50:31.829870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.719 [2024-12-05 12:50:31.829906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.719 #5 NEW cov: 12272 ft: 12758 
corp: 3/69b lim: 40 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 ChangeBit- 00:08:28.719 [2024-12-05 12:50:31.898971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.719 [2024-12-05 12:50:31.898996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.719 [2024-12-05 12:50:31.899053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000008fe cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.719 [2024-12-05 12:50:31.899066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.719 [2024-12-05 12:50:31.899121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.719 [2024-12-05 12:50:31.899134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.719 [2024-12-05 12:50:31.899189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.719 [2024-12-05 12:50:31.899204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.719 #6 NEW cov: 12278 ft: 13081 corp: 4/103b lim: 40 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:28.719 [2024-12-05 12:50:31.958659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.719 [2024-12-05 12:50:31.958683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.719 #7 NEW cov: 12363 ft: 14192 corp: 5/112b lim: 40 exec/s: 0 rss: 72Mb L: 9/34 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:28.719 [2024-12-05 12:50:31.998746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.719 [2024-12-05 12:50:31.998770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.719 #9 NEW cov: 12363 ft: 14357 corp: 6/121b lim: 40 exec/s: 0 rss: 72Mb L: 9/34 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:28.980 [2024-12-05 12:50:32.039456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.980 [2024-12-05 12:50:32.039481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.980 [2024-12-05 12:50:32.039555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.980 [2024-12-05 12:50:32.039569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.980 [2024-12-05 12:50:32.039625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.980 [2024-12-05 12:50:32.039639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.980 [2024-12-05 12:50:32.039695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.980 [2024-12-05 12:50:32.039709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.980 [2024-12-05 12:50:32.039762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.980 [2024-12-05 12:50:32.039775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:28.980 #15 NEW cov: 12363 ft: 14466 corp: 7/161b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 CopyPart- 00:08:28.980 [2024-12-05 12:50:32.078967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.980 [2024-12-05 12:50:32.078991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.980 #16 NEW cov: 12363 ft: 14501 corp: 8/170b lim: 40 exec/s: 0 rss: 72Mb L: 9/40 MS: 1 CopyPart- 00:08:28.980 [2024-12-05 12:50:32.139594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.980 [2024-12-05 12:50:32.139618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.980 [2024-12-05 12:50:32.139674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000008fe cdw11:00002e00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.980 [2024-12-05 12:50:32.139690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.980 [2024-12-05 12:50:32.139759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.980 [2024-12-05 12:50:32.139772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.980 [2024-12-05 12:50:32.139830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.980 [2024-12-05 12:50:32.139853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.980 #17 NEW cov: 12363 ft: 14580 corp: 9/204b lim: 40 exec/s: 0 rss: 72Mb L: 34/40 MS: 1 ChangeByte- 00:08:28.980 [2024-12-05 12:50:32.199949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.981 [2024-12-05 12:50:32.199974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.981 [2024-12-05 
12:50:32.200031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000002d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.981 [2024-12-05 12:50:32.200044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.981 [2024-12-05 12:50:32.200101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.981 [2024-12-05 12:50:32.200115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.981 [2024-12-05 12:50:32.200169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.981 [2024-12-05 12:50:32.200181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.981 [2024-12-05 12:50:32.200236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.981 [2024-12-05 12:50:32.200249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:28.981 #18 NEW cov: 12363 ft: 14629 corp: 10/244b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeByte- 00:08:28.981 [2024-12-05 12:50:32.259652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.981 [2024-12-05 12:50:32.259676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.981 [2024-12-05 12:50:32.259733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.981 [2024-12-05 12:50:32.259746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.981 #20 NEW cov: 12363 ft: 14881 corp: 11/260b lim: 40 exec/s: 0 rss: 72Mb L: 16/40 MS: 2 CrossOver-CrossOver- 00:08:29.241 [2024-12-05 12:50:32.300086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.241 [2024-12-05 12:50:32.300111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.241 [2024-12-05 12:50:32.300169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000008fe cdw11:00002e00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.241 [2024-12-05 12:50:32.300186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.241 [2024-12-05 12:50:32.300241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:06000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.241 [2024-12-05 12:50:32.300254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:29.241 [2024-12-05 12:50:32.300327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.241 [2024-12-05 12:50:32.300341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.241 #21 NEW cov: 12363 ft: 14916 corp: 12/294b lim: 40 exec/s: 0 rss: 72Mb L: 34/40 MS: 1 ChangeBinInt- 00:08:29.241 [2024-12-05 12:50:32.359929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:08000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.359952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.242 [2024-12-05 12:50:32.360008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.360022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.242 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:29.242 #22 NEW cov: 12386 ft: 14957 corp: 13/311b lim: 40 exec/s: 0 rss: 73Mb L: 17/40 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:29.242 [2024-12-05 12:50:32.420396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.420420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.242 [2024-12-05 12:50:32.420477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000008fe cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.420491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.242 [2024-12-05 12:50:32.420561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.420575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.242 [2024-12-05 12:50:32.420630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00003900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.420643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.242 #23 NEW cov: 12386 ft: 15058 corp: 14/345b lim: 40 exec/s: 0 rss: 73Mb L: 34/40 MS: 1 ChangeByte- 00:08:29.242 [2024-12-05 12:50:32.460330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.460354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.242 [2024-12-05 12:50:32.460427] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.460441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.242 [2024-12-05 12:50:32.460500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.460513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.242 #25 NEW cov: 12386 ft: 15265 corp: 15/376b lim: 40 exec/s: 25 rss: 73Mb L: 31/40 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:29.242 [2024-12-05 12:50:32.500598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.500621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.242 [2024-12-05 12:50:32.500678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.500692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.242 [2024-12-05 12:50:32.500747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.500761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.242 [2024-12-05 12:50:32.500814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.500827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.242 #26 NEW cov: 12386 ft: 15294 corp: 16/410b lim: 40 exec/s: 26 rss: 73Mb L: 34/40 MS: 1 ShuffleBytes- 00:08:29.242 [2024-12-05 12:50:32.540280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.242 [2024-12-05 12:50:32.540304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.503 #28 NEW cov: 12386 ft: 15327 corp: 17/423b lim: 40 exec/s: 28 rss: 73Mb L: 13/40 MS: 2 EraseBytes-PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:29.503 [2024-12-05 12:50:32.580523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.580547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.503 [2024-12-05 12:50:32.580620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:98000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 
[2024-12-05 12:50:32.580634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.503 #29 NEW cov: 12386 ft: 15356 corp: 18/439b lim: 40 exec/s: 29 rss: 73Mb L: 16/40 MS: 1 ChangeByte- 00:08:29.503 [2024-12-05 12:50:32.641161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.641185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.503 [2024-12-05 12:50:32.641242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000002d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.641255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.503 [2024-12-05 12:50:32.641310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.641327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.503 [2024-12-05 12:50:32.641381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.641393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.503 [2024-12-05 12:50:32.641451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.641463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:29.503 #30 NEW cov: 12386 ft: 15376 corp: 19/479b lim: 40 exec/s: 30 rss: 73Mb L: 40/40 MS: 1 ChangeByte- 00:08:29.503 [2024-12-05 12:50:32.700722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.700746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.503 #31 NEW cov: 12386 ft: 15383 corp: 20/488b lim: 40 exec/s: 31 rss: 73Mb L: 9/40 MS: 1 CrossOver- 00:08:29.503 [2024-12-05 12:50:32.760921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f8ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.760945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.503 #32 NEW cov: 12386 ft: 15408 corp: 21/497b lim: 40 exec/s: 32 rss: 73Mb L: 9/40 MS: 1 ChangeBinInt- 00:08:29.503 [2024-12-05 12:50:32.801617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.801642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:29.503 [2024-12-05 12:50:32.801700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.801714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.503 [2024-12-05 12:50:32.801770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.801784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.503 [2024-12-05 12:50:32.801843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.801856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.503 [2024-12-05 12:50:32.801913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.503 [2024-12-05 12:50:32.801927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:29.764 #33 NEW cov: 12386 ft: 15483 corp: 22/537b lim: 40 exec/s: 33 rss: 73Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:29.764 [2024-12-05 12:50:32.841673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:32.841698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.764 [2024-12-05 12:50:32.841773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:32.841791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.764 [2024-12-05 12:50:32.841852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:32.841866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.764 [2024-12-05 12:50:32.841922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:32.841935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.764 [2024-12-05 12:50:32.841991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:32.842004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:29.764 #34 NEW cov: 12386 ft: 15533 corp: 23/577b lim: 40 exec/s: 34 rss: 73Mb L: 
40/40 MS: 1 CrossOver- 00:08:29.764 [2024-12-05 12:50:32.901292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:32.901317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.764 #36 NEW cov: 12386 ft: 15544 corp: 24/591b lim: 40 exec/s: 36 rss: 73Mb L: 14/40 MS: 2 EraseBytes-PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:29.764 [2024-12-05 12:50:32.961918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:32.961943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.764 [2024-12-05 12:50:32.962002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:fe00002e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:32.962015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.764 [2024-12-05 12:50:32.962090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00060000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:32.962103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.764 [2024-12-05 12:50:32.962158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:32.962171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.764 #37 NEW cov: 12386 ft: 15606 corp: 25/625b lim: 40 exec/s: 37 rss: 73Mb L: 34/40 MS: 1 CopyPart- 00:08:29.764 [2024-12-05 12:50:33.021940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:33.021964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.764 [2024-12-05 12:50:33.022023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:33.022037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.764 [2024-12-05 12:50:33.022096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.764 [2024-12-05 12:50:33.022109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.764 #38 NEW cov: 12386 ft: 15621 corp: 26/656b lim: 40 exec/s: 38 rss: 73Mb L: 31/40 MS: 1 ShuffleBytes- 00:08:30.025 [2024-12-05 12:50:33.081821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00002900 cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.081852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.025 #39 NEW cov: 12386 ft: 15627 corp: 27/665b lim: 40 exec/s: 39 rss: 74Mb L: 9/40 MS: 1 ChangeByte- 00:08:30.025 [2024-12-05 12:50:33.121877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00003d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.121901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.025 #40 NEW cov: 12386 ft: 15653 corp: 28/675b lim: 40 exec/s: 40 rss: 74Mb L: 10/40 MS: 1 InsertByte- 00:08:30.025 [2024-12-05 12:50:33.162465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.162489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.162548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.162562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.162619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.162632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.162690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00003900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.162703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.025 #41 NEW cov: 12386 ft: 15665 corp: 29/709b lim: 40 exec/s: 41 rss: 74Mb L: 34/40 MS: 1 CrossOver- 00:08:30.025 [2024-12-05 12:50:33.222815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.222844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.222901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000002d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.222914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.222985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0b000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.222999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.223053] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0000b900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.223066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.223125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.223138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:30.025 #42 NEW cov: 12386 ft: 15685 corp: 30/749b lim: 40 exec/s: 42 rss: 74Mb L: 40/40 MS: 1 ChangeByte- 00:08:30.025 [2024-12-05 12:50:33.283024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.283049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.283108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.283121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.283178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.283192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.283252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.283265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.283321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.283334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:30.025 #43 NEW cov: 12386 ft: 15692 corp: 31/789b lim: 40 exec/s: 43 rss: 74Mb L: 40/40 MS: 1 CopyPart- 00:08:30.025 [2024-12-05 12:50:33.322955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.322980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.323034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:0000003d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.323048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.323101] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.323114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.025 [2024-12-05 12:50:33.323167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.025 [2024-12-05 12:50:33.323180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.286 #44 NEW cov: 12386 ft: 15708 corp: 32/823b lim: 40 exec/s: 44 rss: 74Mb L: 34/40 MS: 1 ChangeByte- 00:08:30.286 [2024-12-05 12:50:33.363000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.286 [2024-12-05 12:50:33.363028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.286 [2024-12-05 12:50:33.363084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000c3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.286 [2024-12-05 12:50:33.363098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.286 [2024-12-05 12:50:33.363152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.286 [2024-12-05 12:50:33.363166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.286 [2024-12-05 12:50:33.363221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.286 [2024-12-05 12:50:33.363234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.286 #45 NEW cov: 12386 ft: 15737 corp: 33/857b lim: 40 exec/s: 45 rss: 74Mb L: 34/40 MS: 1 ChangeByte- 00:08:30.286 [2024-12-05 12:50:33.422736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.286 [2024-12-05 12:50:33.422760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.286 #46 NEW cov: 12386 ft: 15750 corp: 34/865b lim: 40 exec/s: 46 rss: 74Mb L: 8/40 MS: 1 EraseBytes- 00:08:30.286 [2024-12-05 12:50:33.462995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.286 [2024-12-05 12:50:33.463019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.286 [2024-12-05 12:50:33.463076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.286 [2024-12-05 12:50:33.463090] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.286 [2024-12-05 12:50:33.503106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00110000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.286 [2024-12-05 12:50:33.503130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.286 [2024-12-05 12:50:33.503203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.286 [2024-12-05 12:50:33.503216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.286 #48 NEW cov: 12386 ft: 15759 corp: 35/885b lim: 40 exec/s: 24 rss: 74Mb L: 20/40 MS: 2 CMP-CMP- DE: "\000\000\000\010"-"\021\000\000\000"- 00:08:30.286 #48 DONE cov: 12386 ft: 15759 corp: 35/885b lim: 40 exec/s: 24 rss: 74Mb 00:08:30.286 ###### Recommended dictionary. ###### 00:08:30.286 "\000\000\000\000\000\000\000\000" # Uses: 3 00:08:30.286 "\000\000\000\010" # Uses: 0 00:08:30.286 "\021\000\000\000" # Uses: 0 00:08:30.286 ###### End of recommended dictionary. ###### 00:08:30.286 Done 48 runs in 2 second(s) 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:30.546 12:50:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:08:30.546 [2024-12-05 12:50:33.670402] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:30.546 [2024-12-05 12:50:33.670496] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid154004 ] 00:08:30.807 [2024-12-05 12:50:33.868477] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.807 [2024-12-05 12:50:33.881096] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.807 [2024-12-05 12:50:33.933411] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:30.807 [2024-12-05 12:50:33.949728] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:30.807 INFO: Running with entropic power schedule (0xFF, 100). 00:08:30.807 INFO: Seed: 3946677708 00:08:30.807 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:30.807 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:30.807 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:30.807 INFO: A corpus is not provided, starting from an empty corpus 00:08:30.807 #2 INITED exec/s: 0 rss: 64Mb 00:08:30.807 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:30.807 This may also happen if the target rejected all inputs we tried so far 00:08:30.807 [2024-12-05 12:50:34.020131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.807 [2024-12-05 12:50:34.020165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.807 [2024-12-05 12:50:34.020313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.807 [2024-12-05 12:50:34.020331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.807 [2024-12-05 12:50:34.020470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.807 [2024-12-05 12:50:34.020488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.067 NEW_FUNC[1/717]: 0x463148 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:31.067 NEW_FUNC[2/717]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:31.067 #3 NEW cov: 12157 ft: 12156 corp: 2/30b lim: 40 exec/s: 0 rss: 72Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:08:31.067 [2024-12-05 12:50:34.360816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.067 [2024-12-05 12:50:34.360859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.067 [2024-12-05 12:50:34.360985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.067 [2024-12-05 12:50:34.361004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.067 [2024-12-05 12:50:34.361122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.067 [2024-12-05 12:50:34.361139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.328 #4 NEW cov: 12270 ft: 12770 corp: 3/59b lim: 40 exec/s: 0 rss: 72Mb L: 29/29 MS: 1 ChangeBinInt- 00:08:31.328 [2024-12-05 12:50:34.431253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.328 [2024-12-05 12:50:34.431280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.328 [2024-12-05 12:50:34.431413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffb42 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.328 [2024-12-05 12:50:34.431430] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.328 [2024-12-05 12:50:34.431555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.328 [2024-12-05 12:50:34.431572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.328 [2024-12-05 12:50:34.431695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.328 [2024-12-05 12:50:34.431711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.328 #5 NEW cov: 12276 ft: 13247 corp: 4/97b lim: 40 exec/s: 0 rss: 73Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:31.328 [2024-12-05 12:50:34.501127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.328 [2024-12-05 12:50:34.501154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.328 [2024-12-05 12:50:34.501278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.328 [2024-12-05 12:50:34.501294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.328 [2024-12-05 12:50:34.501409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:bfffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.328 [2024-12-05 12:50:34.501426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.328 #11 NEW cov: 12361 ft: 13503 corp: 5/126b lim: 40 exec/s: 0 rss: 73Mb L: 29/38 MS: 1 ChangeBit- 00:08:31.328 [2024-12-05 12:50:34.551189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0afff7ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.328 [2024-12-05 12:50:34.551217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.328 [2024-12-05 12:50:34.551344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.328 [2024-12-05 12:50:34.551360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.328 [2024-12-05 12:50:34.551486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.328 [2024-12-05 12:50:34.551503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.328 #12 NEW cov: 12361 ft: 13607 corp: 6/155b lim: 40 exec/s: 0 rss: 73Mb L: 29/38 MS: 1 ChangeBinInt- 00:08:31.328 [2024-12-05 12:50:34.600867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0105f503 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.328 [2024-12-05 12:50:34.600893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.328 #16 NEW cov: 12361 ft: 14377 corp: 7/167b lim: 40 exec/s: 0 rss: 73Mb L: 12/38 MS: 4 CMP-CopyPart-ChangeBit-CMP- DE: "\001\015"-"\365\003\000\000\000\000\000\000"- 00:08:31.589 [2024-12-05 12:50:34.651330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.651357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.589 [2024-12-05 12:50:34.651493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:fffffffb cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.651509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.589 #19 NEW cov: 12361 ft: 14699 corp: 8/184b lim: 40 exec/s: 0 rss: 73Mb L: 17/38 MS: 3 InsertByte-ChangeBit-CrossOver- 00:08:31.589 [2024-12-05 12:50:34.701958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.701985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.589 [2024-12-05 12:50:34.702108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff07ff cdw11:fffffb42 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.702126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.589 [2024-12-05 12:50:34.702252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.702269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.589 [2024-12-05 12:50:34.702395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.702412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.589 #20 NEW cov: 12361 ft: 14734 corp: 9/222b lim: 40 exec/s: 0 rss: 73Mb L: 38/38 MS: 1 ChangeBinInt- 00:08:31.589 [2024-12-05 12:50:34.772357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.772384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.589 [2024-12-05 12:50:34.772524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff47ff cdw11:fffffb42 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.772541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.589 [2024-12-05 12:50:34.772664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.772681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.589 [2024-12-05 12:50:34.772807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.772823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.589 #21 NEW cov: 12361 ft: 14838 corp: 10/260b lim: 40 exec/s: 0 rss: 73Mb L: 38/38 MS: 1 ChangeBit- 00:08:31.589 [2024-12-05 12:50:34.842102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.842131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.589 [2024-12-05 12:50:34.842265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.842282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.589 [2024-12-05 12:50:34.842407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.842425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.589 #22 NEW cov: 12361 ft: 14893 corp: 11/289b lim: 40 exec/s: 0 rss: 73Mb L: 29/38 MS: 1 ChangeBit- 00:08:31.589 [2024-12-05 12:50:34.892004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.892032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.589 [2024-12-05 12:50:34.892153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.589 [2024-12-05 12:50:34.892172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.850 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:31.851 #23 NEW cov: 12384 ft: 14969 corp: 12/311b lim: 40 exec/s: 0 rss: 73Mb L: 22/38 MS: 1 EraseBytes- 00:08:31.851 [2024-12-05 12:50:34.942359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0afff7ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.851 [2024-12-05 12:50:34.942389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.851 [2024-12-05 12:50:34.942521] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.851 [2024-12-05 12:50:34.942543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.851 [2024-12-05 12:50:34.942672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.851 [2024-12-05 12:50:34.942689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.851 #24 NEW cov: 12384 ft: 15005 corp: 13/340b lim: 40 exec/s: 0 rss: 73Mb L: 29/38 MS: 1 ChangeBit- 00:08:31.851 [2024-12-05 12:50:34.992544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.851 [2024-12-05 12:50:34.992571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.851 [2024-12-05 12:50:34.992709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.851 [2024-12-05 12:50:34.992727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.851 [2024-12-05 12:50:34.992853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.851 [2024-12-05 12:50:34.992872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.851 #25 NEW cov: 12384 ft: 15043 corp: 14/369b lim: 40 exec/s: 25 rss: 73Mb L: 29/38 MS: 1 ChangeByte- 00:08:31.851 [2024-12-05 12:50:35.042676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0afff7ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.851 [2024-12-05 12:50:35.042704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.851 [2024-12-05 12:50:35.042830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.851 [2024-12-05 12:50:35.042853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.851 [2024-12-05 12:50:35.042976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff3dffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.851 [2024-12-05 12:50:35.042993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.851 #26 NEW cov: 12384 ft: 15073 corp: 15/399b lim: 40 exec/s: 26 rss: 73Mb L: 30/38 MS: 1 InsertByte- 00:08:31.851 [2024-12-05 12:50:35.112666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.851 [2024-12-05 12:50:35.112696] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.851 [2024-12-05 12:50:35.112821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffef cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.851 [2024-12-05 12:50:35.112843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.851 #32 NEW cov: 12384 ft: 15098 corp: 16/415b lim: 40 exec/s: 32 rss: 73Mb L: 16/38 MS: 1 EraseBytes- 00:08:32.111 [2024-12-05 12:50:35.183502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.111 [2024-12-05 12:50:35.183532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.111 [2024-12-05 12:50:35.183658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff47ff cdw11:fffffb42 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.111 [2024-12-05 12:50:35.183678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.111 [2024-12-05 12:50:35.183804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:42424226 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.111 [2024-12-05 12:50:35.183822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.111 [2024-12-05 12:50:35.183962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.111 [2024-12-05 12:50:35.183979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.111 #33 NEW cov: 12384 ft: 15147 corp: 17/453b lim: 40 exec/s: 33 rss: 73Mb L: 38/38 MS: 1 ChangeByte- 00:08:32.111 [2024-12-05 12:50:35.252909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0105f503 cdw11:00800000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.111 [2024-12-05 12:50:35.252935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.111 #34 NEW cov: 12384 ft: 15186 corp: 18/465b lim: 40 exec/s: 34 rss: 73Mb L: 12/38 MS: 1 ChangeBit- 00:08:32.111 [2024-12-05 12:50:35.323966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.111 [2024-12-05 12:50:35.323991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.111 [2024-12-05 12:50:35.324113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff47ff cdw11:fffffb42 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.111 [2024-12-05 12:50:35.324130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.111 [2024-12-05 12:50:35.324258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:42424226 
cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.111 [2024-12-05 12:50:35.324273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.111 [2024-12-05 12:50:35.324390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:32ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.111 [2024-12-05 12:50:35.324408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.111 #35 NEW cov: 12384 ft: 15246 corp: 19/503b lim: 40 exec/s: 35 rss: 73Mb L: 38/38 MS: 1 ChangeByte- 00:08:32.111 [2024-12-05 12:50:35.394008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.111 [2024-12-05 12:50:35.394034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.112 [2024-12-05 12:50:35.394159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff47ff cdw11:fffffb42 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.112 [2024-12-05 12:50:35.394179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.112 [2024-12-05 12:50:35.394306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:42424226 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.112 [2024-12-05 12:50:35.394323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.112 [2024-12-05 12:50:35.394446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:49000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.112 [2024-12-05 12:50:35.394465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.112 #36 NEW cov: 12384 ft: 15281 corp: 20/541b lim: 40 exec/s: 36 rss: 74Mb L: 38/38 MS: 1 CMP- DE: "I\000\000\000\000\000\000\000"- 00:08:32.372 [2024-12-05 12:50:35.443675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.372 [2024-12-05 12:50:35.443701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.372 [2024-12-05 12:50:35.443828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:fffffffb cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.372 [2024-12-05 12:50:35.443849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.372 #37 NEW cov: 12384 ft: 15293 corp: 21/560b lim: 40 exec/s: 37 rss: 74Mb L: 19/38 MS: 1 EraseBytes- 00:08:32.372 [2024-12-05 12:50:35.513877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.372 [2024-12-05 12:50:35.513903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:32.372 [2024-12-05 12:50:35.514040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.372 [2024-12-05 12:50:35.514057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.372 #38 NEW cov: 12384 ft: 15315 corp: 22/583b lim: 40 exec/s: 38 rss: 74Mb L: 23/38 MS: 1 CopyPart- 00:08:32.372 [2024-12-05 12:50:35.584082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0105f503 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.372 [2024-12-05 12:50:35.584108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.372 [2024-12-05 12:50:35.584246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:10000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.372 [2024-12-05 12:50:35.584264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.372 #39 NEW cov: 12384 ft: 15360 corp: 23/603b lim: 40 exec/s: 39 rss: 74Mb L: 20/38 MS: 1 CMP- DE: "\020\000\000\000\000\000\000\000"- 00:08:32.372 [2024-12-05 12:50:35.634783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.372 [2024-12-05 12:50:35.634808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.372 [2024-12-05 12:50:35.634951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff07ff cdw11:fffffb42 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.372 [2024-12-05 12:50:35.634969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.372 [2024-12-05 12:50:35.635096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.372 [2024-12-05 12:50:35.635114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.372 [2024-12-05 12:50:35.635241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:f5030000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.372 [2024-12-05 12:50:35.635261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.372 #40 NEW cov: 12384 ft: 15388 corp: 24/641b lim: 40 exec/s: 40 rss: 74Mb L: 38/38 MS: 1 PersAutoDict- DE: "\365\003\000\000\000\000\000\000"- 00:08:32.372 [2024-12-05 12:50:35.684251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0105f503 cdw11:00000200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.372 [2024-12-05 12:50:35.684280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.632 #41 NEW cov: 12384 ft: 15415 corp: 25/653b lim: 40 exec/s: 41 rss: 74Mb L: 12/38 MS: 1 ChangeBit- 00:08:32.632 [2024-12-05 12:50:35.734818] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.632 [2024-12-05 12:50:35.734851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.632 [2024-12-05 12:50:35.734988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.632 [2024-12-05 12:50:35.735006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.632 [2024-12-05 12:50:35.735133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:1d000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.632 [2024-12-05 12:50:35.735149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.632 #42 NEW cov: 12384 ft: 15429 corp: 26/682b lim: 40 exec/s: 42 rss: 74Mb L: 29/38 MS: 1 ChangeBinInt- 00:08:32.632 [2024-12-05 12:50:35.804755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff7f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.632 [2024-12-05 12:50:35.804782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.632 [2024-12-05 12:50:35.804911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.632 [2024-12-05 12:50:35.804928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.632 #43 NEW cov: 12384 ft: 15452 corp: 27/705b lim: 40 exec/s: 43 rss: 74Mb L: 23/38 MS: 1 ChangeBit- 00:08:32.632 [2024-12-05 12:50:35.875205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff7f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.632 [2024-12-05 12:50:35.875233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.632 [2024-12-05 12:50:35.875367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.632 [2024-12-05 12:50:35.875384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.632 [2024-12-05 12:50:35.875512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:8ffffbff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.632 [2024-12-05 12:50:35.875530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.632 #44 NEW cov: 12384 ft: 15479 corp: 28/729b lim: 40 exec/s: 44 rss: 74Mb L: 24/38 MS: 1 InsertByte- 00:08:32.632 [2024-12-05 12:50:35.945097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.632 [2024-12-05 12:50:35.945124] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.632 [2024-12-05 12:50:35.945258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffffb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.632 [2024-12-05 12:50:35.945277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.892 #45 NEW cov: 12384 ft: 15498 corp: 29/751b lim: 40 exec/s: 45 rss: 74Mb L: 22/38 MS: 1 ShuffleBytes- 00:08:32.892 [2024-12-05 12:50:35.995745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.892 [2024-12-05 12:50:35.995770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:32.892 [2024-12-05 12:50:35.995893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff07ff cdw11:fffffb42 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.892 [2024-12-05 12:50:35.995911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:32.892 [2024-12-05 12:50:35.996033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.892 [2024-12-05 12:50:35.996050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:32.892 [2024-12-05 12:50:35.996166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:f5430000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.892 [2024-12-05 12:50:35.996183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:32.892 #46 NEW cov: 12384 ft: 15511 corp: 30/789b lim: 40 exec/s: 23 rss: 74Mb L: 38/38 MS: 1 ChangeBit- 00:08:32.892 #46 DONE cov: 12384 ft: 15511 corp: 30/789b lim: 40 exec/s: 23 rss: 74Mb 00:08:32.892 ###### Recommended dictionary. ###### 00:08:32.892 "\001\015" # Uses: 0 00:08:32.892 "\365\003\000\000\000\000\000\000" # Uses: 1 00:08:32.892 "I\000\000\000\000\000\000\000" # Uses: 0 00:08:32.892 "\020\000\000\000\000\000\000\000" # Uses: 0 00:08:32.892 ###### End of recommended dictionary. 
###### 00:08:32.892 Done 46 runs in 2 second(s) 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:32.892 12:50:36 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:08:32.892 [2024-12-05 12:50:36.180729] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
00:08:32.892 [2024-12-05 12:50:36.180799] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid154362 ] 00:08:33.152 [2024-12-05 12:50:36.380526] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.152 [2024-12-05 12:50:36.393837] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.152 [2024-12-05 12:50:36.446592] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.152 [2024-12-05 12:50:36.462906] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:33.412 INFO: Running with entropic power schedule (0xFF, 100). 00:08:33.412 INFO: Seed: 2165727325 00:08:33.412 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:33.412 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:33.412 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:33.412 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.412 #2 INITED exec/s: 0 rss: 64Mb 00:08:33.412 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:33.412 This may also happen if the target rejected all inputs we tried so far 00:08:33.412 [2024-12-05 12:50:36.529347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.412 [2024-12-05 12:50:36.529385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.412 [2024-12-05 12:50:36.529525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.412 [2024-12-05 12:50:36.529542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.412 [2024-12-05 12:50:36.529676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:84848484 cdw11:84840aff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.412 [2024-12-05 12:50:36.529694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.671 NEW_FUNC[1/716]: 0x464d18 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:33.671 NEW_FUNC[2/716]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:33.671 #6 NEW cov: 12143 ft: 12146 corp: 2/25b lim: 40 exec/s: 0 rss: 72Mb L: 24/24 MS: 4 CopyPart-EraseBytes-InsertByte-InsertRepeatedBytes- 00:08:33.671 [2024-12-05 12:50:36.870123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.671 [2024-12-05 12:50:36.870174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.671 #12 NEW cov: 12258 ft: 13045 corp: 3/34b lim: 40 exec/s: 0 rss: 72Mb L: 9/24 MS: 1 
CrossOver- 00:08:33.671 [2024-12-05 12:50:36.920034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a848400 cdw11:00000984 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.671 [2024-12-05 12:50:36.920069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.671 #13 NEW cov: 12264 ft: 13193 corp: 4/43b lim: 40 exec/s: 0 rss: 72Mb L: 9/24 MS: 1 ChangeBinInt- 00:08:33.930 [2024-12-05 12:50:36.990235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a848400 cdw11:00000984 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.930 [2024-12-05 12:50:36.990265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.930 #14 NEW cov: 12349 ft: 13502 corp: 5/52b lim: 40 exec/s: 0 rss: 72Mb L: 9/24 MS: 1 ChangeByte- 00:08:33.930 [2024-12-05 12:50:37.060352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.930 [2024-12-05 12:50:37.060382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.930 #15 NEW cov: 12349 ft: 13573 corp: 6/66b lim: 40 exec/s: 0 rss: 73Mb L: 14/24 MS: 1 InsertRepeatedBytes- 00:08:33.930 [2024-12-05 12:50:37.110778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ab1b1b1 cdw11:b1b1b1b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.930 [2024-12-05 12:50:37.110806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.930 [2024-12-05 12:50:37.110949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:b1b18484 cdw11:00000009 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.930 [2024-12-05 12:50:37.110966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.930 #16 NEW cov: 12349 ft: 13829 corp: 7/84b lim: 40 exec/s: 0 rss: 73Mb L: 18/24 MS: 1 InsertRepeatedBytes- 00:08:33.930 [2024-12-05 12:50:37.181340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.930 [2024-12-05 12:50:37.181367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.930 [2024-12-05 12:50:37.181515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.930 [2024-12-05 12:50:37.181531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:33.930 [2024-12-05 12:50:37.181657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.930 [2024-12-05 12:50:37.181673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:33.930 [2024-12-05 12:50:37.181819] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.930 [2024-12-05 12:50:37.181839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:33.930 #22 NEW cov: 12349 ft: 14428 corp: 8/117b lim: 40 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:33.930 [2024-12-05 12:50:37.231056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ab1b1b1 cdw11:b1b3b1b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.930 [2024-12-05 12:50:37.231084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.930 [2024-12-05 12:50:37.231222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:b1b18484 cdw11:00000009 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.930 [2024-12-05 12:50:37.231239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.190 #23 NEW cov: 12349 ft: 14576 corp: 9/135b lim: 40 exec/s: 0 rss: 73Mb L: 18/33 MS: 1 ChangeBit- 00:08:34.190 [2024-12-05 12:50:37.301010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a848400 cdw11:0000092e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.190 [2024-12-05 12:50:37.301038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.190 #24 NEW cov: 12349 ft: 14688 corp: 10/144b lim: 40 exec/s: 0 rss: 73Mb L: 9/33 MS: 1 ChangeByte- 00:08:34.190 [2024-12-05 12:50:37.351241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a848400 cdw11:00000984 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.190 [2024-12-05 12:50:37.351270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.190 #25 NEW cov: 12349 ft: 14756 corp: 11/155b lim: 40 exec/s: 0 rss: 73Mb L: 11/33 MS: 1 CrossOver- 00:08:34.190 [2024-12-05 12:50:37.421486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a84840a cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.190 [2024-12-05 12:50:37.421514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.190 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:34.190 #26 NEW cov: 12372 ft: 14808 corp: 12/164b lim: 40 exec/s: 0 rss: 73Mb L: 9/33 MS: 1 CopyPart- 00:08:34.190 [2024-12-05 12:50:37.471607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00f7ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.190 [2024-12-05 12:50:37.471637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.449 #27 NEW cov: 12372 ft: 14844 corp: 13/178b lim: 40 exec/s: 27 rss: 73Mb L: 14/33 MS: 1 ChangeBinInt- 00:08:34.449 [2024-12-05 12:50:37.541775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3f7ffff cdw11:ffffffff 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.449 [2024-12-05 12:50:37.541803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.449 #28 NEW cov: 12372 ft: 14874 corp: 14/192b lim: 40 exec/s: 28 rss: 73Mb L: 14/33 MS: 1 ChangeByte- 00:08:34.449 [2024-12-05 12:50:37.612009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a848400 cdw11:00000960 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.449 [2024-12-05 12:50:37.612037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.449 #29 NEW cov: 12372 ft: 14955 corp: 15/200b lim: 40 exec/s: 29 rss: 73Mb L: 8/33 MS: 1 EraseBytes- 00:08:34.449 [2024-12-05 12:50:37.682635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.449 [2024-12-05 12:50:37.682663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.449 [2024-12-05 12:50:37.682795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.449 [2024-12-05 12:50:37.682812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.449 [2024-12-05 12:50:37.682947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.449 [2024-12-05 12:50:37.682963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.449 #30 NEW cov: 12372 ft: 14976 corp: 16/227b lim: 40 exec/s: 30 rss: 73Mb L: 27/33 MS: 1 EraseBytes- 00:08:34.449 [2024-12-05 12:50:37.752604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.449 [2024-12-05 12:50:37.752633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.449 [2024-12-05 12:50:37.752777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.449 [2024-12-05 12:50:37.752794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.707 #31 NEW cov: 12372 ft: 15040 corp: 17/248b lim: 40 exec/s: 31 rss: 74Mb L: 21/33 MS: 1 EraseBytes- 00:08:34.707 [2024-12-05 12:50:37.822663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0c8400 cdw11:00000984 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.707 [2024-12-05 12:50:37.822689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.707 #32 NEW cov: 12372 ft: 15071 corp: 18/257b lim: 40 exec/s: 32 rss: 74Mb L: 9/33 MS: 1 ChangeByte- 00:08:34.707 [2024-12-05 12:50:37.872731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 
cdw10:e3f7ffff cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.707 [2024-12-05 12:50:37.872760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.707 #33 NEW cov: 12372 ft: 15116 corp: 19/271b lim: 40 exec/s: 33 rss: 74Mb L: 14/33 MS: 1 ShuffleBytes- 00:08:34.707 [2024-12-05 12:50:37.943037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00f7ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.707 [2024-12-05 12:50:37.943064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.707 #34 NEW cov: 12372 ft: 15120 corp: 20/285b lim: 40 exec/s: 34 rss: 74Mb L: 14/33 MS: 1 ShuffleBytes- 00:08:34.707 [2024-12-05 12:50:37.993770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3f7ffff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.707 [2024-12-05 12:50:37.993800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.707 [2024-12-05 12:50:37.993920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.707 [2024-12-05 12:50:37.993937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.707 [2024-12-05 12:50:37.994071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.707 [2024-12-05 12:50:37.994087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.707 [2024-12-05 12:50:37.994215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.707 [2024-12-05 12:50:37.994233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.965 #35 NEW cov: 12372 ft: 15151 corp: 21/320b lim: 40 exec/s: 35 rss: 74Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:34.965 [2024-12-05 12:50:38.064003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.965 [2024-12-05 12:50:38.064032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.965 [2024-12-05 12:50:38.064174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.965 [2024-12-05 12:50:38.064193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.965 [2024-12-05 12:50:38.064319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:8484840a cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.965 [2024-12-05 12:50:38.064349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:34.965 [2024-12-05 12:50:38.064481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:84848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.965 [2024-12-05 12:50:38.064498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:34.965 #36 NEW cov: 12372 ft: 15170 corp: 22/359b lim: 40 exec/s: 36 rss: 74Mb L: 39/39 MS: 1 CopyPart- 00:08:34.965 [2024-12-05 12:50:38.133659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3f7ffff cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.965 [2024-12-05 12:50:38.133690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.965 #37 NEW cov: 12372 ft: 15182 corp: 23/373b lim: 40 exec/s: 37 rss: 74Mb L: 14/39 MS: 1 ChangeByte- 00:08:34.965 [2024-12-05 12:50:38.183947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ab1b1b1 cdw11:b1b3b102 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.965 [2024-12-05 12:50:38.183974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.965 [2024-12-05 12:50:38.184113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000009 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.965 [2024-12-05 12:50:38.184132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.965 #38 NEW cov: 12372 ft: 15240 corp: 24/391b lim: 40 exec/s: 38 rss: 74Mb L: 18/39 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:08:34.965 [2024-12-05 12:50:38.254173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a84840a cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.965 [2024-12-05 12:50:38.254202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.965 [2024-12-05 12:50:38.254332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:84020000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.965 [2024-12-05 12:50:38.254350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.224 #39 NEW cov: 12372 ft: 15262 corp: 25/408b lim: 40 exec/s: 39 rss: 74Mb L: 17/39 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:35.224 [2024-12-05 12:50:38.324395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a848400 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.224 [2024-12-05 12:50:38.324424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.224 [2024-12-05 12:50:38.324558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00001100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.224 [2024-12-05 12:50:38.324590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.224 #40 NEW cov: 12372 ft: 15299 corp: 26/425b lim: 40 exec/s: 40 rss: 74Mb L: 17/39 MS: 1 ChangeBinInt- 00:08:35.224 [2024-12-05 12:50:38.395053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3f7ff59 cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.224 [2024-12-05 12:50:38.395083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.224 [2024-12-05 12:50:38.395220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.224 [2024-12-05 12:50:38.395237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.224 [2024-12-05 12:50:38.395372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.224 [2024-12-05 12:50:38.395393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:35.224 [2024-12-05 12:50:38.395528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:000000ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.224 [2024-12-05 12:50:38.395546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:35.224 #41 NEW cov: 12372 ft: 15326 corp: 27/461b lim: 40 exec/s: 41 rss: 74Mb L: 36/39 MS: 1 InsertByte- 00:08:35.224 [2024-12-05 12:50:38.464804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ab1b1b1 cdw11:b1b1b1b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.224 [2024-12-05 12:50:38.464831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.224 [2024-12-05 12:50:38.464968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:b1b18484 cdw11:00000009 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.224 [2024-12-05 12:50:38.464983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.224 #42 NEW cov: 12372 ft: 15344 corp: 28/479b lim: 40 exec/s: 42 rss: 74Mb L: 18/39 MS: 1 ChangeByte- 00:08:35.224 [2024-12-05 12:50:38.514755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a84e300 cdw11:00000984 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:35.224 [2024-12-05 12:50:38.514784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.224 #43 NEW cov: 12372 ft: 15385 corp: 29/490b lim: 40 exec/s: 21 rss: 74Mb L: 11/39 MS: 1 ChangeByte- 00:08:35.224 #43 DONE cov: 12372 ft: 15385 corp: 29/490b lim: 40 exec/s: 21 rss: 74Mb 00:08:35.224 ###### Recommended dictionary. ###### 00:08:35.224 "\002\000\000\000\000\000\000\000" # Uses: 1 00:08:35.224 ###### End of recommended dictionary. 
###### 00:08:35.224 Done 43 runs in 2 second(s) 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:35.483 12:50:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:08:35.483 [2024-12-05 12:50:38.681817] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
00:08:35.483 [2024-12-05 12:50:38.681906] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid154896 ] 00:08:35.741 [2024-12-05 12:50:38.882052] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.742 [2024-12-05 12:50:38.894767] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.742 [2024-12-05 12:50:38.947131] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:35.742 [2024-12-05 12:50:38.963423] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:35.742 INFO: Running with entropic power schedule (0xFF, 100). 00:08:35.742 INFO: Seed: 372768609 00:08:35.742 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:35.742 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:35.742 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:35.742 INFO: A corpus is not provided, starting from an empty corpus 00:08:35.742 #2 INITED exec/s: 0 rss: 64Mb 00:08:35.742 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:35.742 This may also happen if the target rejected all inputs we tried so far 00:08:36.258 NEW_FUNC[1/703]: 0x4668e8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:36.258 NEW_FUNC[2/703]: 0x488308 in feat_async_event_cfg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:346 00:08:36.258 #6 NEW cov: 12100 ft: 12083 corp: 2/13b lim: 35 exec/s: 0 rss: 72Mb L: 12/12 MS: 4 ChangeBit-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:36.258 #7 NEW cov: 12213 ft: 12681 corp: 3/26b lim: 35 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 InsertByte- 00:08:36.258 #13 NEW cov: 12219 ft: 13004 corp: 4/38b lim: 35 exec/s: 0 rss: 72Mb L: 12/13 MS: 1 CopyPart- 00:08:36.258 #14 NEW cov: 12304 ft: 13183 corp: 5/51b lim: 35 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 InsertByte- 00:08:36.258 [2024-12-05 12:50:39.513805] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.258 [2024-12-05 12:50:39.513843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.258 [2024-12-05 12:50:39.513915] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.258 [2024-12-05 12:50:39.513931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.258 [2024-12-05 12:50:39.513986] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.258 [2024-12-05 12:50:39.514008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:36.258 NEW_FUNC[1/15]: 0x19668c8 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:08:36.258 NEW_FUNC[2/15]: 0x1966b08 in nvme_admin_qpair_print_command 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:08:36.258 #15 NEW cov: 12443 ft: 13997 corp: 6/73b lim: 35 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 CrossOver- 00:08:36.517 #21 NEW cov: 12443 ft: 14035 corp: 7/85b lim: 35 exec/s: 0 rss: 72Mb L: 12/22 MS: 1 ShuffleBytes- 00:08:36.517 #22 NEW cov: 12443 ft: 14196 corp: 8/97b lim: 35 exec/s: 0 rss: 72Mb L: 12/22 MS: 1 CrossOver- 00:08:36.517 #23 NEW cov: 12443 ft: 14224 corp: 9/110b lim: 35 exec/s: 0 rss: 72Mb L: 13/22 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:36.517 [2024-12-05 12:50:39.694299] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.517 [2024-12-05 12:50:39.694331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.517 [2024-12-05 12:50:39.694404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.517 [2024-12-05 12:50:39.694421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.517 [2024-12-05 12:50:39.694480] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.517 [2024-12-05 12:50:39.694494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:36.517 #24 NEW cov: 12450 ft: 14339 corp: 10/133b lim: 35 exec/s: 0 rss: 73Mb L: 23/23 MS: 1 CrossOver- 00:08:36.517 #25 NEW cov: 12450 ft: 14407 corp: 11/145b lim: 35 exec/s: 0 rss: 73Mb L: 12/23 MS: 1 ChangeBit- 00:08:36.777 #26 NEW cov: 12450 ft: 14423 corp: 12/158b lim: 35 exec/s: 0 rss: 73Mb L: 13/23 MS: 1 ChangeBit- 00:08:36.777 #27 NEW cov: 12450 ft: 14520 corp: 13/170b lim: 35 exec/s: 0 rss: 73Mb L: 12/23 MS: 1 ChangeByte- 00:08:36.777 [2024-12-05 12:50:39.914849] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.777 [2024-12-05 12:50:39.914878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.777 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:36.777 #28 NEW cov: 12473 ft: 14796 corp: 14/184b lim: 35 exec/s: 0 rss: 73Mb L: 14/23 MS: 1 InsertByte- 00:08:36.777 [2024-12-05 12:50:39.975218] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.777 [2024-12-05 12:50:39.975245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:36.777 #29 NEW cov: 12473 ft: 14957 corp: 15/210b lim: 35 exec/s: 0 rss: 73Mb L: 26/26 MS: 1 CopyPart- 00:08:36.777 [2024-12-05 12:50:40.015163] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.777 [2024-12-05 12:50:40.015193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.777 #30 NEW cov: 12473 ft: 14966 corp: 16/224b lim: 35 exec/s: 30 rss: 73Mb L: 14/26 MS: 
1 InsertByte- 00:08:36.777 #31 NEW cov: 12473 ft: 14982 corp: 17/233b lim: 35 exec/s: 31 rss: 73Mb L: 9/26 MS: 1 CrossOver- 00:08:37.037 #32 NEW cov: 12473 ft: 15029 corp: 18/245b lim: 35 exec/s: 32 rss: 73Mb L: 12/26 MS: 1 ChangeByte- 00:08:37.037 [2024-12-05 12:50:40.155697] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.037 [2024-12-05 12:50:40.155730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.037 [2024-12-05 12:50:40.155803] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.037 [2024-12-05 12:50:40.155819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.037 #33 NEW cov: 12473 ft: 15147 corp: 19/268b lim: 35 exec/s: 33 rss: 73Mb L: 23/26 MS: 1 CopyPart- 00:08:37.037 [2024-12-05 12:50:40.195808] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.037 [2024-12-05 12:50:40.195837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.037 #34 NEW cov: 12473 ft: 15173 corp: 20/294b lim: 35 exec/s: 34 rss: 73Mb L: 26/26 MS: 1 ChangeBinInt- 00:08:37.037 [2024-12-05 12:50:40.256084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.037 [2024-12-05 12:50:40.256126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.037 #35 NEW cov: 12473 ft: 15201 corp: 21/320b lim: 35 exec/s: 35 rss: 73Mb L: 26/26 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:37.298 #36 NEW cov: 12473 ft: 15263 corp: 22/328b lim: 35 exec/s: 36 rss: 73Mb L: 8/26 MS: 1 EraseBytes- 00:08:37.298 #37 NEW cov: 12473 ft: 15280 corp: 23/341b lim: 35 exec/s: 37 rss: 73Mb L: 13/26 MS: 1 ChangeBit- 00:08:37.298 [2024-12-05 12:50:40.416272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.298 [2024-12-05 12:50:40.416298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.298 #38 NEW cov: 12473 ft: 15300 corp: 24/355b lim: 35 exec/s: 38 rss: 73Mb L: 14/26 MS: 1 InsertByte- 00:08:37.298 [2024-12-05 12:50:40.456463] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.298 [2024-12-05 12:50:40.456490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.298 [2024-12-05 12:50:40.456548] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.298 [2024-12-05 12:50:40.456564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.298 [2024-12-05 12:50:40.456620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES 
RESERVED cid:6 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.298 [2024-12-05 12:50:40.456636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.298 #39 NEW cov: 12473 ft: 15308 corp: 25/377b lim: 35 exec/s: 39 rss: 73Mb L: 22/26 MS: 1 ChangeBinInt- 00:08:37.298 [2024-12-05 12:50:40.496849] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.298 [2024-12-05 12:50:40.496876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.298 [2024-12-05 12:50:40.496927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ASYNC EVENT CONFIGURATION cid:7 cdw10:8000000b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.298 [2024-12-05 12:50:40.496943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.298 #40 NEW cov: 12473 ft: 15584 corp: 26/406b lim: 35 exec/s: 40 rss: 73Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:08:37.298 #41 NEW cov: 12473 ft: 15595 corp: 27/419b lim: 35 exec/s: 41 rss: 73Mb L: 13/29 MS: 1 ChangeByte- 00:08:37.298 [2024-12-05 12:50:40.576530] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.298 [2024-12-05 12:50:40.576557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.558 #45 NEW cov: 12473 ft: 15601 corp: 28/429b lim: 35 exec/s: 45 rss: 74Mb L: 10/29 MS: 4 EraseBytes-ChangeByte-ChangeByte-CrossOver- 00:08:37.558 [2024-12-05 12:50:40.637121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.558 [2024-12-05 12:50:40.637147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.558 #46 NEW cov: 12473 ft: 15603 corp: 29/455b lim: 35 exec/s: 46 rss: 74Mb L: 26/29 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:37.558 [2024-12-05 12:50:40.677328] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.558 [2024-12-05 12:50:40.677354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.558 [2024-12-05 12:50:40.677449] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.558 [2024-12-05 12:50:40.677464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.558 #47 NEW cov: 12473 ft: 15606 corp: 30/485b lim: 35 exec/s: 47 rss: 74Mb L: 30/30 MS: 1 CrossOver- 00:08:37.558 #48 NEW cov: 12473 ft: 15662 corp: 31/498b lim: 35 exec/s: 48 rss: 74Mb L: 13/30 MS: 1 CopyPart- 00:08:37.558 #49 NEW cov: 12473 ft: 15667 corp: 32/510b lim: 35 exec/s: 49 rss: 74Mb L: 12/30 MS: 1 ChangeBinInt- 00:08:37.558 [2024-12-05 12:50:40.797125] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:37.558 [2024-12-05 12:50:40.797153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.558 #50 NEW cov: 12473 ft: 15689 corp: 33/520b lim: 35 exec/s: 50 rss: 74Mb L: 10/30 MS: 1 EraseBytes- 00:08:37.819 #51 NEW cov: 12473 ft: 15691 corp: 34/533b lim: 35 exec/s: 51 rss: 74Mb L: 13/30 MS: 1 ChangeBinInt- 00:08:37.819 [2024-12-05 12:50:40.897732] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.819 [2024-12-05 12:50:40.897758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.819 [2024-12-05 12:50:40.897813] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.819 [2024-12-05 12:50:40.897829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.819 [2024-12-05 12:50:40.897904] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.819 [2024-12-05 12:50:40.897920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.819 #52 NEW cov: 12473 ft: 15703 corp: 35/557b lim: 35 exec/s: 52 rss: 74Mb L: 24/30 MS: 1 InsertByte- 00:08:37.819 #53 NEW cov: 12473 ft: 15713 corp: 36/569b lim: 35 exec/s: 53 rss: 74Mb L: 12/30 MS: 1 CopyPart- 00:08:37.819 [2024-12-05 12:50:41.017949] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000de SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.819 [2024-12-05 12:50:41.017977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.819 #54 NEW cov: 12473 ft: 15733 corp: 37/587b lim: 35 exec/s: 27 rss: 74Mb L: 18/30 MS: 1 InsertRepeatedBytes- 00:08:37.819 #54 DONE cov: 12473 ft: 15733 corp: 37/587b lim: 35 exec/s: 27 rss: 74Mb 00:08:37.819 ###### Recommended dictionary. ###### 00:08:37.819 "\377\377\377\377\377\377\377\377" # Uses: 2 00:08:37.819 ###### End of recommended dictionary. 
###### 00:08:37.819 Done 54 runs in 2 second(s) 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:38.080 12:50:41 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:38.080 [2024-12-05 12:50:41.204678] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
00:08:38.080 [2024-12-05 12:50:41.204749] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155277 ] 00:08:38.340 [2024-12-05 12:50:41.410392] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.340 [2024-12-05 12:50:41.423295] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.340 [2024-12-05 12:50:41.475729] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:38.340 [2024-12-05 12:50:41.492090] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:38.340 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.340 INFO: Seed: 2900743350 00:08:38.340 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:38.340 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:38.340 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:38.340 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.340 #2 INITED exec/s: 0 rss: 64Mb 00:08:38.340 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:38.340 This may also happen if the target rejected all inputs we tried so far 00:08:38.340 [2024-12-05 12:50:41.557614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.340 [2024-12-05 12:50:41.557645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.340 [2024-12-05 12:50:41.557700] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.340 [2024-12-05 12:50:41.557714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.340 [2024-12-05 12:50:41.557768] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.340 [2024-12-05 12:50:41.557782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.601 NEW_FUNC[1/716]: 0x467e28 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:38.601 NEW_FUNC[2/716]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:38.601 #9 NEW cov: 12127 ft: 12128 corp: 2/23b lim: 35 exec/s: 0 rss: 72Mb L: 22/22 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:38.601 [2024-12-05 12:50:41.888680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.601 [2024-12-05 12:50:41.888734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.601 [2024-12-05 12:50:41.888823] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.601 [2024-12-05 12:50:41.888872] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.601 [2024-12-05 12:50:41.888956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.601 [2024-12-05 12:50:41.888981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.861 #10 NEW cov: 12240 ft: 12827 corp: 3/46b lim: 35 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 InsertByte- 00:08:38.861 [2024-12-05 12:50:41.948546] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:41.948572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.861 [2024-12-05 12:50:41.948646] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:41.948660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.861 [2024-12-05 12:50:41.948721] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:41.948735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.861 #11 NEW cov: 12246 ft: 13065 corp: 4/70b lim: 35 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 InsertByte- 00:08:38.861 NEW_FUNC[1/1]: 0x487e38 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:38.861 #13 NEW cov: 12345 ft: 13526 corp: 5/79b lim: 35 exec/s: 0 rss: 72Mb L: 9/24 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:38.861 [2024-12-05 12:50:42.048787] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:42.048812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.861 [2024-12-05 12:50:42.048890] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:42.048909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.861 [2024-12-05 12:50:42.048967] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:42.048980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.861 #14 NEW cov: 12345 ft: 13741 corp: 6/101b lim: 35 exec/s: 0 rss: 72Mb L: 22/24 MS: 1 ChangeByte- 00:08:38.861 [2024-12-05 12:50:42.089050] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:42.089076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.861 [2024-12-05 12:50:42.089134] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:42.089148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.861 [2024-12-05 12:50:42.089207] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:42.089220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.861 [2024-12-05 12:50:42.089276] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:42.089289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:38.861 #15 NEW cov: 12345 ft: 14294 corp: 7/134b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:38.861 [2024-12-05 12:50:42.149128] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:42.149153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.861 [2024-12-05 12:50:42.149228] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.861 [2024-12-05 12:50:42.149242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.861 [2024-12-05 12:50:42.149302] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.862 [2024-12-05 12:50:42.149316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.862 #16 NEW cov: 12345 ft: 14336 corp: 8/156b lim: 35 exec/s: 0 rss: 72Mb L: 22/33 MS: 1 ChangeByte- 00:08:39.122 [2024-12-05 12:50:42.189179] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.189204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.122 [2024-12-05 12:50:42.189263] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.189277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.122 [2024-12-05 12:50:42.189335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.189348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.122 #17 NEW cov: 12345 ft: 14400 corp: 9/178b lim: 35 exec/s: 0 rss: 72Mb L: 22/33 MS: 1 ChangeByte- 00:08:39.122 [2024-12-05 12:50:42.229342] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 
cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.229367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.122 [2024-12-05 12:50:42.229427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.229440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.122 [2024-12-05 12:50:42.229499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.229512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.122 #18 NEW cov: 12345 ft: 14464 corp: 10/200b lim: 35 exec/s: 0 rss: 72Mb L: 22/33 MS: 1 ChangeByte- 00:08:39.122 [2024-12-05 12:50:42.289521] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.289545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.122 [2024-12-05 12:50:42.289622] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.289636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.122 [2024-12-05 12:50:42.289696] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.289709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.122 #19 NEW cov: 12345 ft: 14550 corp: 11/221b lim: 35 exec/s: 0 rss: 72Mb L: 21/33 MS: 1 EraseBytes- 00:08:39.122 [2024-12-05 12:50:42.349664] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.349689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.122 [2024-12-05 12:50:42.349749] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.349762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.122 [2024-12-05 12:50:42.349820] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.349837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.122 #20 NEW cov: 12345 ft: 14565 corp: 12/243b lim: 35 exec/s: 0 rss: 72Mb L: 22/33 MS: 1 CopyPart- 00:08:39.122 [2024-12-05 12:50:42.389782] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.389807] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.122 [2024-12-05 12:50:42.389868] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.389882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.122 [2024-12-05 12:50:42.389942] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.122 [2024-12-05 12:50:42.389954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.122 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:39.122 #21 NEW cov: 12368 ft: 14574 corp: 13/265b lim: 35 exec/s: 0 rss: 73Mb L: 22/33 MS: 1 ShuffleBytes- 00:08:39.383 [2024-12-05 12:50:42.449917] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.449943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.383 [2024-12-05 12:50:42.450005] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000248 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.450019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.383 [2024-12-05 12:50:42.450092] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.450107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.383 #22 NEW cov: 12368 ft: 14599 corp: 14/290b lim: 35 exec/s: 0 rss: 73Mb L: 25/33 MS: 1 InsertRepeatedBytes- 00:08:39.383 [2024-12-05 12:50:42.510086] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.510112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.383 [2024-12-05 12:50:42.510171] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.510184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.383 [2024-12-05 12:50:42.510241] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.510254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.383 #23 NEW cov: 12368 ft: 14717 corp: 15/313b lim: 35 exec/s: 23 rss: 73Mb L: 23/33 MS: 1 InsertByte- 00:08:39.383 [2024-12-05 12:50:42.570441] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:39.383 [2024-12-05 12:50:42.570467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.383 [2024-12-05 12:50:42.570545] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.570559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.383 [2024-12-05 12:50:42.570620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.570633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.383 [2024-12-05 12:50:42.570691] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.570704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.383 #24 NEW cov: 12368 ft: 14730 corp: 16/344b lim: 35 exec/s: 24 rss: 73Mb L: 31/33 MS: 1 CrossOver- 00:08:39.383 [2024-12-05 12:50:42.630431] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.630457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.383 [2024-12-05 12:50:42.630524] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.630538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.383 [2024-12-05 12:50:42.630598] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.383 [2024-12-05 12:50:42.630612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.383 #25 NEW cov: 12368 ft: 14748 corp: 17/369b lim: 35 exec/s: 25 rss: 73Mb L: 25/33 MS: 1 ShuffleBytes- 00:08:39.383 [2024-12-05 12:50:42.690723] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.384 [2024-12-05 12:50:42.690748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.384 [2024-12-05 12:50:42.690810] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.384 [2024-12-05 12:50:42.690823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.384 [2024-12-05 12:50:42.690903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.384 [2024-12-05 12:50:42.690917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.384 [2024-12-05 12:50:42.690974] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.384 [2024-12-05 12:50:42.690988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.644 #26 NEW cov: 12368 ft: 14763 corp: 18/402b lim: 35 exec/s: 26 rss: 73Mb L: 33/33 MS: 1 ChangeBit- 00:08:39.644 [2024-12-05 12:50:42.750735] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.644 [2024-12-05 12:50:42.750761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.644 [2024-12-05 12:50:42.750822] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000748 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.644 [2024-12-05 12:50:42.750840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.644 [2024-12-05 12:50:42.750900] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.644 [2024-12-05 12:50:42.750914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.644 #27 NEW cov: 12368 ft: 14834 corp: 19/427b lim: 35 exec/s: 27 rss: 73Mb L: 25/33 MS: 1 ChangeBinInt- 00:08:39.644 [2024-12-05 12:50:42.811055] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.644 [2024-12-05 12:50:42.811080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.644 [2024-12-05 12:50:42.811138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.644 [2024-12-05 12:50:42.811152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.644 [2024-12-05 12:50:42.811208] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.644 [2024-12-05 12:50:42.811224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.644 [2024-12-05 12:50:42.811282] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000004ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.644 [2024-12-05 12:50:42.811296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.644 #28 NEW cov: 12368 ft: 14840 corp: 20/460b lim: 35 exec/s: 28 rss: 73Mb L: 33/33 MS: 1 ShuffleBytes- 00:08:39.644 [2024-12-05 12:50:42.851037] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.644 [2024-12-05 12:50:42.851062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.644 [2024-12-05 12:50:42.851138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 
cdw10:000007b9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.644 [2024-12-05 12:50:42.851152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.644 [2024-12-05 12:50:42.851209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.644 [2024-12-05 12:50:42.851223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.644 #29 NEW cov: 12368 ft: 14886 corp: 21/483b lim: 35 exec/s: 29 rss: 73Mb L: 23/33 MS: 1 InsertByte- 00:08:39.644 [2024-12-05 12:50:42.891303] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.644 [2024-12-05 12:50:42.891328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.645 [2024-12-05 12:50:42.891389] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.645 [2024-12-05 12:50:42.891403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.645 [2024-12-05 12:50:42.891458] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.645 [2024-12-05 12:50:42.891471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.645 [2024-12-05 12:50:42.891531] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.645 [2024-12-05 12:50:42.891545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.645 #30 NEW cov: 12368 ft: 14898 corp: 22/516b lim: 35 exec/s: 30 rss: 74Mb L: 33/33 MS: 1 ChangeBinInt- 00:08:39.645 [2024-12-05 12:50:42.951446] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.645 [2024-12-05 12:50:42.951471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.645 [2024-12-05 12:50:42.951530] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.645 [2024-12-05 12:50:42.951544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.645 [2024-12-05 12:50:42.951602] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.645 [2024-12-05 12:50:42.951615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.645 [2024-12-05 12:50:42.951673] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.645 [2024-12-05 12:50:42.951689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 
p:0 m:0 dnr:0 00:08:39.906 #31 NEW cov: 12368 ft: 14952 corp: 23/544b lim: 35 exec/s: 31 rss: 74Mb L: 28/33 MS: 1 InsertRepeatedBytes- 00:08:39.906 [2024-12-05 12:50:42.991361] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:42.991386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.906 [2024-12-05 12:50:42.991445] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006d9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:42.991459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.906 #32 NEW cov: 12368 ft: 15139 corp: 24/559b lim: 35 exec/s: 32 rss: 74Mb L: 15/33 MS: 1 EraseBytes- 00:08:39.906 [2024-12-05 12:50:43.051643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.051669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.906 [2024-12-05 12:50:43.051728] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.051742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.906 [2024-12-05 12:50:43.051813] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.051827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.906 #33 NEW cov: 12368 ft: 15175 corp: 25/582b lim: 35 exec/s: 33 rss: 74Mb L: 23/33 MS: 1 ChangeByte- 00:08:39.906 [2024-12-05 12:50:43.111776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.111802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.906 [2024-12-05 12:50:43.111877] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.111893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.906 [2024-12-05 12:50:43.111951] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.111965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.906 #34 NEW cov: 12368 ft: 15252 corp: 26/605b lim: 35 exec/s: 34 rss: 74Mb L: 23/33 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:39.906 [2024-12-05 12:50:43.152055] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.152081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.906 [2024-12-05 12:50:43.152140] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.152154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.906 [2024-12-05 12:50:43.152210] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.152223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.906 [2024-12-05 12:50:43.152281] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.152294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:39.906 #35 NEW cov: 12368 ft: 15274 corp: 27/639b lim: 35 exec/s: 35 rss: 74Mb L: 34/34 MS: 1 InsertByte- 00:08:39.906 [2024-12-05 12:50:43.192031] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.192056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:39.906 [2024-12-05 12:50:43.192115] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.192129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:39.906 [2024-12-05 12:50:43.192185] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000001f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:39.906 [2024-12-05 12:50:43.192198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:39.906 #36 NEW cov: 12368 ft: 15292 corp: 28/664b lim: 35 exec/s: 36 rss: 74Mb L: 25/34 MS: 1 InsertByte- 00:08:40.168 [2024-12-05 12:50:43.232129] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.232154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.232231] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.232245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.232302] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.232315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.168 #37 NEW cov: 12368 ft: 15328 corp: 29/687b lim: 35 exec/s: 37 rss: 74Mb L: 23/34 MS: 1 ChangeBinInt- 00:08:40.168 [2024-12-05 
12:50:43.272255] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.272279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.272355] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000248 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.272369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.272426] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.272439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.168 #38 NEW cov: 12368 ft: 15376 corp: 30/712b lim: 35 exec/s: 38 rss: 74Mb L: 25/34 MS: 1 ChangeBit- 00:08:40.168 [2024-12-05 12:50:43.312396] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.312420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.312481] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.312498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.168 NEW_FUNC[1/2]: 0x4871b8 in feat_interrupt_vector_configuration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:332 00:08:40.168 NEW_FUNC[2/2]: 0x1399218 in nvmf_ctrlr_get_features_interrupt_vector_configuration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1709 00:08:40.168 #39 NEW cov: 12417 ft: 15444 corp: 31/735b lim: 35 exec/s: 39 rss: 74Mb L: 23/34 MS: 1 ChangeBinInt- 00:08:40.168 [2024-12-05 12:50:43.382546] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000133 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.382573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.382631] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.382645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.382703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.382716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.168 #40 NEW cov: 12417 ft: 15458 corp: 32/761b lim: 35 exec/s: 40 rss: 74Mb L: 26/34 MS: 1 InsertRepeatedBytes- 00:08:40.168 [2024-12-05 12:50:43.422787] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.422812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.422891] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.422906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.422965] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.422977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.423037] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.423051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.168 #41 NEW cov: 12417 ft: 15494 corp: 33/795b lim: 35 exec/s: 41 rss: 74Mb L: 34/34 MS: 1 CopyPart- 00:08:40.168 [2024-12-05 12:50:43.462874] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.462898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.462959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.462972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.463032] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.463045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.168 [2024-12-05 12:50:43.463107] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000004ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.168 [2024-12-05 12:50:43.463120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.430 #42 NEW cov: 12417 ft: 15503 corp: 34/829b lim: 35 exec/s: 42 rss: 74Mb L: 34/34 MS: 1 InsertByte- 00:08:40.430 [2024-12-05 12:50:43.523060] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.430 [2024-12-05 12:50:43.523084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.430 [2024-12-05 12:50:43.523159] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.430 [2024-12-05 12:50:43.523173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.430 [2024-12-05 12:50:43.523230] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.430 [2024-12-05 12:50:43.523244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.430 [2024-12-05 12:50:43.523301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000006d9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.430 [2024-12-05 12:50:43.523315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.430 #43 NEW cov: 12417 ft: 15505 corp: 35/857b lim: 35 exec/s: 21 rss: 74Mb L: 28/34 MS: 1 ChangeBit- 00:08:40.430 #43 DONE cov: 12417 ft: 15505 corp: 35/857b lim: 35 exec/s: 21 rss: 74Mb 00:08:40.430 ###### Recommended dictionary. ###### 00:08:40.430 "\377\377\377\377" # Uses: 0 00:08:40.430 ###### End of recommended dictionary. ###### 00:08:40.430 Done 43 runs in 2 second(s) 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:40.430 12:50:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 
trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:40.430 [2024-12-05 12:50:43.687647] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:40.430 [2024-12-05 12:50:43.687733] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid155715 ] 00:08:40.690 [2024-12-05 12:50:43.886486] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.690 [2024-12-05 12:50:43.899216] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.690 [2024-12-05 12:50:43.951547] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:40.690 [2024-12-05 12:50:43.967879] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:40.690 INFO: Running with entropic power schedule (0xFF, 100). 00:08:40.690 INFO: Seed: 1081797330 00:08:40.951 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:40.951 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:40.951 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:40.951 INFO: A corpus is not provided, starting from an empty corpus 00:08:40.951 #2 INITED exec/s: 0 rss: 64Mb 00:08:40.951 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:40.951 This may also happen if the target rejected all inputs we tried so far 00:08:40.951 [2024-12-05 12:50:44.033180] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.951 [2024-12-05 12:50:44.033211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.951 [2024-12-05 12:50:44.033265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.951 [2024-12-05 12:50:44.033280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.211 NEW_FUNC[1/716]: 0x4692e8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:41.211 NEW_FUNC[2/716]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:41.211 #5 NEW cov: 12209 ft: 12210 corp: 2/61b lim: 105 exec/s: 0 rss: 72Mb L: 60/60 MS: 3 ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:41.211 [2024-12-05 12:50:44.364187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.211 [2024-12-05 12:50:44.364245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.211 [2024-12-05 12:50:44.364341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.211 [2024-12-05 12:50:44.364371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.211 NEW_FUNC[1/1]: 0x1a17168 in nvme_tcp_qpair /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_tcp.c:183 00:08:41.211 #6 NEW cov: 12343 ft: 12946 corp: 3/122b lim: 105 exec/s: 0 rss: 72Mb L: 61/61 MS: 1 InsertByte- 00:08:41.211 [2024-12-05 12:50:44.434109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.211 [2024-12-05 12:50:44.434137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.211 [2024-12-05 12:50:44.434191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.211 [2024-12-05 12:50:44.434210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.211 #7 NEW cov: 12349 ft: 13116 corp: 4/183b lim: 105 exec/s: 0 rss: 72Mb L: 61/61 MS: 1 CrossOver- 00:08:41.211 [2024-12-05 12:50:44.494261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.211 [2024-12-05 12:50:44.494289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.211 [2024-12-05 12:50:44.494342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.211 [2024-12-05 12:50:44.494356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.211 #8 NEW cov: 12434 ft: 13409 corp: 5/244b lim: 105 exec/s: 0 rss: 72Mb L: 61/61 MS: 1 InsertByte- 00:08:41.472 [2024-12-05 12:50:44.534492] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.472 [2024-12-05 12:50:44.534518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.472 [2024-12-05 12:50:44.534564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.472 [2024-12-05 12:50:44.534579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.472 [2024-12-05 12:50:44.534633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:3539992576 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.472 [2024-12-05 12:50:44.534648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.472 #9 NEW cov: 12434 ft: 13939 corp: 6/312b lim: 105 exec/s: 0 rss: 72Mb L: 68/68 MS: 1 CopyPart- 00:08:41.472 [2024-12-05 12:50:44.594548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.472 [2024-12-05 12:50:44.594576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.472 [2024-12-05 12:50:44.594617] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:768 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.472 [2024-12-05 12:50:44.594633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.472 #10 NEW cov: 12434 ft: 14026 corp: 7/373b lim: 105 exec/s: 0 rss: 72Mb L: 61/68 MS: 1 ChangeBinInt- 00:08:41.472 [2024-12-05 12:50:44.654587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772170 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.472 [2024-12-05 12:50:44.654614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.472 #11 NEW cov: 12434 ft: 14490 corp: 8/404b lim: 105 exec/s: 0 rss: 72Mb L: 31/68 MS: 1 CrossOver- 00:08:41.472 [2024-12-05 12:50:44.694795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.472 [2024-12-05 12:50:44.694822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.472 [2024-12-05 12:50:44.694862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.472 [2024-12-05 12:50:44.694878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.472 #12 NEW cov: 12434 ft: 14538 corp: 9/464b lim: 105 exec/s: 0 rss: 72Mb L: 60/68 MS: 1 ShuffleBytes- 00:08:41.472 [2024-12-05 12:50:44.734863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.472 [2024-12-05 12:50:44.734890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.472 #16 NEW cov: 12434 ft: 14603 corp: 10/492b lim: 105 exec/s: 0 rss: 72Mb L: 28/68 MS: 4 ChangeByte-ChangeByte-ChangeBinInt-CrossOver- 00:08:41.472 [2024-12-05 12:50:44.775045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.472 [2024-12-05 12:50:44.775071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.472 [2024-12-05 12:50:44.775107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.472 [2024-12-05 12:50:44.775123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.733 #17 NEW cov: 12434 ft: 14644 corp: 11/552b lim: 105 exec/s: 0 rss: 73Mb L: 60/68 MS: 1 ChangeBinInt- 00:08:41.733 [2024-12-05 12:50:44.835246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.733 [2024-12-05 12:50:44.835272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.733 [2024-12-05 12:50:44.835309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:41.733 [2024-12-05 12:50:44.835326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.733 #18 NEW cov: 12434 ft: 14669 corp: 12/613b lim: 105 exec/s: 0 rss: 73Mb L: 61/68 MS: 1 ShuffleBytes- 00:08:41.733 [2024-12-05 12:50:44.875318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.733 [2024-12-05 12:50:44.875344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.733 [2024-12-05 12:50:44.875385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.733 [2024-12-05 12:50:44.875401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.733 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:41.733 #24 NEW cov: 12457 ft: 14752 corp: 13/655b lim: 105 exec/s: 0 rss: 73Mb L: 42/68 MS: 1 EraseBytes- 00:08:41.733 [2024-12-05 12:50:44.935511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.733 [2024-12-05 12:50:44.935537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.733 [2024-12-05 12:50:44.935574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.733 [2024-12-05 12:50:44.935588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.733 #25 NEW cov: 12457 ft: 14786 corp: 14/715b lim: 105 exec/s: 0 rss: 73Mb L: 60/68 MS: 1 ChangeBit- 00:08:41.733 [2024-12-05 12:50:44.995675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.733 [2024-12-05 12:50:44.995700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.733 [2024-12-05 12:50:44.995738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.733 [2024-12-05 12:50:44.995757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.733 #26 NEW cov: 12457 ft: 14791 corp: 15/757b lim: 105 exec/s: 26 rss: 73Mb L: 42/68 MS: 1 CrossOver- 00:08:41.993 [2024-12-05 12:50:45.055842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.993 [2024-12-05 12:50:45.055869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.993 [2024-12-05 12:50:45.055904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.993 [2024-12-05 12:50:45.055919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.993 #27 NEW cov: 12457 ft: 14813 corp: 16/818b lim: 105 exec/s: 27 rss: 73Mb L: 61/68 MS: 1 ShuffleBytes- 00:08:41.993 [2024-12-05 12:50:45.095961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.993 [2024-12-05 12:50:45.095987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.993 [2024-12-05 12:50:45.096030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.993 [2024-12-05 12:50:45.096044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.993 #28 NEW cov: 12457 ft: 14837 corp: 17/879b lim: 105 exec/s: 28 rss: 73Mb L: 61/68 MS: 1 ChangeBinInt- 00:08:41.993 [2024-12-05 12:50:45.156103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.993 [2024-12-05 12:50:45.156129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.993 [2024-12-05 12:50:45.156165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:768 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.993 [2024-12-05 12:50:45.156180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.993 #29 NEW cov: 12457 ft: 14847 corp: 18/940b lim: 105 exec/s: 29 rss: 73Mb L: 61/68 MS: 1 ChangeByte- 00:08:41.993 [2024-12-05 12:50:45.196233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772170 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.993 [2024-12-05 12:50:45.196262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.993 [2024-12-05 12:50:45.196317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:10995116277760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.993 [2024-12-05 12:50:45.196333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.993 #30 NEW cov: 12457 ft: 14851 corp: 19/1001b lim: 105 exec/s: 30 rss: 73Mb L: 61/68 MS: 1 CopyPart- 00:08:41.993 [2024-12-05 12:50:45.256398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.993 [2024-12-05 12:50:45.256424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.993 [2024-12-05 12:50:45.256463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.993 [2024-12-05 12:50:45.256479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.993 #31 NEW cov: 12457 ft: 14857 corp: 20/1063b lim: 105 exec/s: 31 rss: 73Mb L: 62/68 MS: 1 InsertByte- 00:08:42.253 [2024-12-05 12:50:45.316560] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:541333651456 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.316586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.253 [2024-12-05 12:50:45.316622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.316638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.253 #32 NEW cov: 12457 ft: 14861 corp: 21/1124b lim: 105 exec/s: 32 rss: 73Mb L: 61/68 MS: 1 ChangeByte- 00:08:42.253 [2024-12-05 12:50:45.356662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.356688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.253 [2024-12-05 12:50:45.356724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.356739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.253 #33 NEW cov: 12457 ft: 14919 corp: 22/1184b lim: 105 exec/s: 33 rss: 73Mb L: 60/68 MS: 1 ChangeByte- 00:08:42.253 [2024-12-05 12:50:45.396674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.396701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.253 #34 NEW cov: 12457 ft: 14939 corp: 23/1222b lim: 105 exec/s: 34 rss: 73Mb L: 38/68 MS: 1 EraseBytes- 00:08:42.253 [2024-12-05 12:50:45.436783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.436809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.253 #38 NEW cov: 12457 ft: 14956 corp: 24/1260b lim: 105 exec/s: 38 rss: 73Mb L: 38/68 MS: 4 ChangeBit-ShuffleBytes-ChangeBit-CrossOver- 00:08:42.253 [2024-12-05 12:50:45.477167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.477194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.253 [2024-12-05 12:50:45.477243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:720575953264181248 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.477259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.253 [2024-12-05 12:50:45.477313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.477328] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.253 #39 NEW cov: 12457 ft: 14996 corp: 25/1326b lim: 105 exec/s: 39 rss: 73Mb L: 66/68 MS: 1 CopyPart- 00:08:42.253 [2024-12-05 12:50:45.537395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.537421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.253 [2024-12-05 12:50:45.537490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.537510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.253 [2024-12-05 12:50:45.537566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.537580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.253 [2024-12-05 12:50:45.537633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.253 [2024-12-05 12:50:45.537649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.253 #40 NEW cov: 12457 ft: 15523 corp: 26/1425b lim: 105 exec/s: 40 rss: 73Mb L: 99/99 MS: 1 CrossOver- 00:08:42.513 [2024-12-05 12:50:45.577283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.513 [2024-12-05 12:50:45.577310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.513 [2024-12-05 12:50:45.577347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:33 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.513 [2024-12-05 12:50:45.577362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.513 #41 NEW cov: 12457 ft: 15542 corp: 27/1487b lim: 105 exec/s: 41 rss: 73Mb L: 62/99 MS: 1 ChangeBit- 00:08:42.513 [2024-12-05 12:50:45.637449] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.513 [2024-12-05 12:50:45.637475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.514 [2024-12-05 12:50:45.637528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.514 [2024-12-05 12:50:45.637544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.514 #42 NEW cov: 12457 ft: 15565 corp: 28/1547b lim: 105 exec/s: 42 rss: 73Mb L: 60/99 MS: 1 ShuffleBytes- 00:08:42.514 [2024-12-05 12:50:45.677582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772170 
len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.514 [2024-12-05 12:50:45.677610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.514 [2024-12-05 12:50:45.677648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:10995116277760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.514 [2024-12-05 12:50:45.677663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.514 #43 NEW cov: 12457 ft: 15574 corp: 29/1608b lim: 105 exec/s: 43 rss: 73Mb L: 61/99 MS: 1 ChangeBit- 00:08:42.514 [2024-12-05 12:50:45.737984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.514 [2024-12-05 12:50:45.738011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.514 [2024-12-05 12:50:45.738090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.514 [2024-12-05 12:50:45.738105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.514 [2024-12-05 12:50:45.738175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.514 [2024-12-05 12:50:45.738192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.514 [2024-12-05 12:50:45.738253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:4278190080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.514 [2024-12-05 12:50:45.738269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.514 #44 NEW cov: 12457 ft: 15611 corp: 30/1705b lim: 105 exec/s: 44 rss: 73Mb L: 97/99 MS: 1 InsertRepeatedBytes- 00:08:42.514 [2024-12-05 12:50:45.777737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.514 [2024-12-05 12:50:45.777764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.514 #45 NEW cov: 12457 ft: 15623 corp: 31/1744b lim: 105 exec/s: 45 rss: 73Mb L: 39/99 MS: 1 EraseBytes- 00:08:42.514 [2024-12-05 12:50:45.818017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772170 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.514 [2024-12-05 12:50:45.818043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.514 [2024-12-05 12:50:45.818081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:10995116277760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.514 [2024-12-05 12:50:45.818096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.774 #46 NEW cov: 12457 ft: 15636 corp: 32/1805b lim: 105 exec/s: 46 
rss: 73Mb L: 61/99 MS: 1 ChangeBinInt- 00:08:42.774 [2024-12-05 12:50:45.878105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.774 [2024-12-05 12:50:45.878131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.774 [2024-12-05 12:50:45.878169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.774 [2024-12-05 12:50:45.878185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.774 #47 NEW cov: 12457 ft: 15644 corp: 33/1865b lim: 105 exec/s: 47 rss: 73Mb L: 60/99 MS: 1 ChangeBinInt- 00:08:42.774 [2024-12-05 12:50:45.918500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.774 [2024-12-05 12:50:45.918526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.774 [2024-12-05 12:50:45.918575] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.774 [2024-12-05 12:50:45.918590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.774 [2024-12-05 12:50:45.918647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.774 [2024-12-05 12:50:45.918662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.774 [2024-12-05 12:50:45.918715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.774 [2024-12-05 12:50:45.918730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.774 #48 NEW cov: 12457 ft: 15646 corp: 34/1954b lim: 105 exec/s: 48 rss: 73Mb L: 89/99 MS: 1 CopyPart- 00:08:42.774 [2024-12-05 12:50:45.958365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.774 [2024-12-05 12:50:45.958396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.774 [2024-12-05 12:50:45.958439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:65280 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.774 [2024-12-05 12:50:45.958453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.774 #49 NEW cov: 12457 ft: 15660 corp: 35/2014b lim: 105 exec/s: 49 rss: 73Mb L: 60/99 MS: 1 ChangeByte- 00:08:42.774 [2024-12-05 12:50:46.018532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.774 [2024-12-05 12:50:46.018558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.774 [2024-12-05 12:50:46.018595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.774 [2024-12-05 12:50:46.018611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.774 #50 NEW cov: 12457 ft: 15664 corp: 36/2074b lim: 105 exec/s: 25 rss: 73Mb L: 60/99 MS: 1 ChangeBinInt- 00:08:42.774 #50 DONE cov: 12457 ft: 15664 corp: 36/2074b lim: 105 exec/s: 25 rss: 73Mb 00:08:42.774 Done 50 runs in 2 second(s) 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:43.035 12:50:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:43.035 [2024-12-05 12:50:46.202861] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
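The run.sh trace above is the complete per-index launch recipe: derive TCP port 44NN from the fuzzer index, clone the shared fuzz_json.conf with sed so the target listens on that port instead of the default 4420, register two known-leak suppressions for LeakSanitizer, and start llvm_nvme_fuzz against the resulting NVMe/TCP target. A condensed sketch of the same steps, assembled only from commands visible in the trace; the paths are shortened, and the `i` loop variable and the redirections into the config and suppression files are assumptions, since the trace shows only the bare commands:

    i=17
    port="44$(printf %02d "$i")"          # index 17 -> TCP port 4417
    corpus="$spdk/../corpus/llvm_nvmf_$i" # $spdk/$output stand in for the long workspace paths
    mkdir -p "$corpus"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    # Retarget the listener port in a private copy of the template config
    # (output redirection assumed; the trace shows only the sed command).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$spdk/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$i.conf"

    # Known leaks to ignore, consumed by LeakSanitizer via LSAN_OPTIONS.
    echo leak:spdk_nvmf_qpair_disconnect  > /var/tmp/suppress_nvmf_fuzz
    echo leak:nvmf_ctrlr_create          >> /var/tmp/suppress_nvmf_fuzz
    export LSAN_OPTIONS="report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0"

    # -m core mask, -s hugepage memory (MB), -t run time (s), -Z fuzzer index.
    "$spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
        -P "$output/llvm/" -F "$trid" -c "/tmp/fuzz_json_$i.conf" -t 1 \
        -D "$corpus" -Z "$i"

The outer loop in common.sh repeats this with the next index, which is why the same block reappears below for fuzzer 18 on port 4418.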
00:08:43.035 [2024-12-05 12:50:46.202939] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid156243 ] 00:08:43.295 [2024-12-05 12:50:46.402804] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.295 [2024-12-05 12:50:46.415222] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.295 [2024-12-05 12:50:46.467729] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:43.295 [2024-12-05 12:50:46.484061] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:43.295 INFO: Running with entropic power schedule (0xFF, 100). 00:08:43.295 INFO: Seed: 3597790701 00:08:43.295 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:43.295 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:43.295 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:43.295 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.295 #2 INITED exec/s: 0 rss: 64Mb 00:08:43.295 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:43.295 This may also happen if the target rejected all inputs we tried so far 00:08:43.295 [2024-12-05 12:50:46.560614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.295 [2024-12-05 12:50:46.560656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.295 [2024-12-05 12:50:46.560778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.295 [2024-12-05 12:50:46.560803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.295 [2024-12-05 12:50:46.560925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.295 [2024-12-05 12:50:46.560947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.295 [2024-12-05 12:50:46.561065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.295 [2024-12-05 12:50:46.561088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.555 NEW_FUNC[1/718]: 0x46c668 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:43.555 NEW_FUNC[2/718]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:43.555 #12 NEW cov: 12234 ft: 12235 corp: 2/100b lim: 120 exec/s: 0 rss: 72Mb L: 99/99 MS: 5 ShuffleBytes-InsertByte-CrossOver-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:43.814 [2024-12-05 12:50:46.891484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:46.891519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:46.891645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:46.891663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:46.891782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:46.891802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:46.891927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:46.891953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.814 #18 NEW cov: 12364 ft: 12868 corp: 3/199b lim: 120 exec/s: 0 rss: 72Mb L: 99/99 MS: 1 CrossOver- 00:08:43.814 [2024-12-05 12:50:46.961686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:46.961717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:46.961818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:46.961844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:46.961967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:46.961990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:46.962106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:46.962128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.814 #24 NEW cov: 12370 ft: 13111 corp: 4/298b lim: 120 exec/s: 0 rss: 72Mb L: 99/99 MS: 1 CopyPart- 00:08:43.814 [2024-12-05 12:50:47.001666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.001697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:47.001791] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.001811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:47.001931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.001955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:47.002073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.002094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.814 #25 NEW cov: 12455 ft: 13405 corp: 5/397b lim: 120 exec/s: 0 rss: 72Mb L: 99/99 MS: 1 ChangeBinInt- 00:08:43.814 [2024-12-05 12:50:47.041861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.041897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:47.041981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.042004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:47.042123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.042146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:47.042268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.042291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.814 #26 NEW cov: 12455 ft: 13463 corp: 6/496b lim: 120 exec/s: 0 rss: 72Mb L: 99/99 MS: 1 ChangeByte- 00:08:43.814 [2024-12-05 12:50:47.101967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.101998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:47.102086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.102107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:47.102226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.102249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.814 [2024-12-05 12:50:47.102373] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:43.814 [2024-12-05 12:50:47.102395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.815 #27 NEW cov: 12455 ft: 13497 corp: 7/595b lim: 120 exec/s: 0 rss: 72Mb L: 99/99 MS: 1 CopyPart- 00:08:44.074 [2024-12-05 12:50:47.142211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65287 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.142241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.074 [2024-12-05 12:50:47.142337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.142370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.074 [2024-12-05 12:50:47.142493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.142514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.074 [2024-12-05 12:50:47.142634] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.142656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.074 #28 NEW cov: 12455 ft: 13568 corp: 8/694b lim: 120 exec/s: 0 rss: 72Mb L: 99/99 MS: 1 ChangeBinInt- 00:08:44.074 [2024-12-05 12:50:47.212423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744071746617343 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.212457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.074 [2024-12-05 12:50:47.212542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.212565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.074 [2024-12-05 12:50:47.212681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.212704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.074 [2024-12-05 12:50:47.212825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.212847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.074 #29 NEW cov: 12455 ft: 13683 corp: 9/793b lim: 120 exec/s: 0 rss: 72Mb L: 99/99 MS: 1 ChangeBit- 00:08:44.074 [2024-12-05 12:50:47.262605] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709488895 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.262640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.074 [2024-12-05 12:50:47.262733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.262757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.074 [2024-12-05 12:50:47.262887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.262914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.074 [2024-12-05 12:50:47.263031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.263055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.074 #30 NEW cov: 12455 ft: 13721 corp: 10/892b lim: 120 exec/s: 0 rss: 72Mb L: 99/99 MS: 1 ShuffleBytes- 00:08:44.074 [2024-12-05 12:50:47.332741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65287 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.074 [2024-12-05 12:50:47.332779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.075 [2024-12-05 12:50:47.332866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.075 [2024-12-05 12:50:47.332890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.075 [2024-12-05 12:50:47.333011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.075 [2024-12-05 12:50:47.333035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.075 [2024-12-05 12:50:47.333148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.075 [2024-12-05 12:50:47.333170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.075 #31 NEW cov: 12455 ft: 13796 corp: 11/991b lim: 120 exec/s: 0 rss: 72Mb L: 99/99 MS: 1 ShuffleBytes- 00:08:44.334 [2024-12-05 12:50:47.402460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:184549375 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.334 [2024-12-05 12:50:47.402493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.334 [2024-12-05 12:50:47.402614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:44.334 [2024-12-05 12:50:47.402636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.334 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:44.334 #32 NEW cov: 12478 ft: 14273 corp: 12/1045b lim: 120 exec/s: 0 rss: 73Mb L: 54/99 MS: 1 EraseBytes- 00:08:44.334 [2024-12-05 12:50:47.473138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65287 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.334 [2024-12-05 12:50:47.473170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.334 [2024-12-05 12:50:47.473249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.334 [2024-12-05 12:50:47.473271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.334 [2024-12-05 12:50:47.473386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18374686479688007679 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.334 [2024-12-05 12:50:47.473407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.334 [2024-12-05 12:50:47.473526] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.334 [2024-12-05 12:50:47.473547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.334 #33 NEW cov: 12478 ft: 14304 corp: 13/1144b lim: 120 exec/s: 0 rss: 73Mb L: 99/99 MS: 1 ChangeBinInt- 00:08:44.334 [2024-12-05 12:50:47.523356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.334 [2024-12-05 12:50:47.523385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.334 [2024-12-05 12:50:47.523452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.334 [2024-12-05 12:50:47.523472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.335 [2024-12-05 12:50:47.523589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.335 [2024-12-05 12:50:47.523611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.335 [2024-12-05 12:50:47.523733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.335 [2024-12-05 12:50:47.523758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.335 #34 NEW cov: 12478 ft: 14332 corp: 14/1243b lim: 120 exec/s: 34 rss: 73Mb L: 99/99 MS: 1 ChangeBinInt- 00:08:44.335 
[2024-12-05 12:50:47.573310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.335 [2024-12-05 12:50:47.573343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.335 [2024-12-05 12:50:47.573452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.335 [2024-12-05 12:50:47.573473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.335 [2024-12-05 12:50:47.573597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.335 [2024-12-05 12:50:47.573618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.335 #35 NEW cov: 12478 ft: 14622 corp: 15/1338b lim: 120 exec/s: 35 rss: 73Mb L: 95/99 MS: 1 EraseBytes- 00:08:44.335 [2024-12-05 12:50:47.623743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.335 [2024-12-05 12:50:47.623777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.335 [2024-12-05 12:50:47.623864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.335 [2024-12-05 12:50:47.623889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.335 [2024-12-05 12:50:47.624010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16640 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.335 [2024-12-05 12:50:47.624035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.335 [2024-12-05 12:50:47.624155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.335 [2024-12-05 12:50:47.624176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.335 #36 NEW cov: 12478 ft: 14645 corp: 16/1438b lim: 120 exec/s: 36 rss: 73Mb L: 100/100 MS: 1 InsertByte- 00:08:44.594 [2024-12-05 12:50:47.673857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.594 [2024-12-05 12:50:47.673888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.594 [2024-12-05 12:50:47.673964] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.594 [2024-12-05 12:50:47.673986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.594 [2024-12-05 12:50:47.674106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
sqid:1 cid:2 nsid:0 lba:16640 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.594 [2024-12-05 12:50:47.674127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.594 [2024-12-05 12:50:47.674251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4611686018427387904 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.594 [2024-12-05 12:50:47.674272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.594 #37 NEW cov: 12478 ft: 14662 corp: 17/1538b lim: 120 exec/s: 37 rss: 73Mb L: 100/100 MS: 1 ChangeBit- 00:08:44.594 [2024-12-05 12:50:47.744003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744071746617343 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.594 [2024-12-05 12:50:47.744033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.594 [2024-12-05 12:50:47.744126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.594 [2024-12-05 12:50:47.744146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.594 [2024-12-05 12:50:47.744262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.594 [2024-12-05 12:50:47.744286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.594 [2024-12-05 12:50:47.744407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.594 [2024-12-05 12:50:47.744434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.594 #38 NEW cov: 12478 ft: 14718 corp: 18/1637b lim: 120 exec/s: 38 rss: 73Mb L: 99/100 MS: 1 ShuffleBytes- 00:08:44.594 [2024-12-05 12:50:47.814220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744071746617343 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.594 [2024-12-05 12:50:47.814254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.595 [2024-12-05 12:50:47.814362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.595 [2024-12-05 12:50:47.814386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.595 [2024-12-05 12:50:47.814501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.595 [2024-12-05 12:50:47.814522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.595 [2024-12-05 12:50:47.814639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:1086626725888 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.595 
[2024-12-05 12:50:47.814661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.595 #39 NEW cov: 12478 ft: 14723 corp: 19/1736b lim: 120 exec/s: 39 rss: 73Mb L: 99/100 MS: 1 ChangeBinInt- 00:08:44.595 [2024-12-05 12:50:47.873643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.595 [2024-12-05 12:50:47.873674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.595 #40 NEW cov: 12478 ft: 15516 corp: 20/1775b lim: 120 exec/s: 40 rss: 73Mb L: 39/100 MS: 1 CrossOver- 00:08:44.855 [2024-12-05 12:50:47.924581] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:47.924613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:47.924727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:11009 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:47.924759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:47.924873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:47.924895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:47.925012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:47.925036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.855 #41 NEW cov: 12478 ft: 15521 corp: 21/1875b lim: 120 exec/s: 41 rss: 73Mb L: 100/100 MS: 1 InsertByte- 00:08:44.855 [2024-12-05 12:50:47.984794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:47.984823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:47.984892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:47.984917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:47.985043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:47.985067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:47.985188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:44.855 [2024-12-05 12:50:47.985213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.855 #42 NEW cov: 12478 ft: 15547 corp: 22/1974b lim: 120 exec/s: 42 rss: 73Mb L: 99/100 MS: 1 CopyPart- 00:08:44.855 [2024-12-05 12:50:48.024825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744071746617343 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.024858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:48.024938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.024958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:48.025071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.025093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:48.025209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.025232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.855 #43 NEW cov: 12478 ft: 15568 corp: 23/2073b lim: 120 exec/s: 43 rss: 73Mb L: 99/100 MS: 1 ChangeByte- 00:08:44.855 [2024-12-05 12:50:48.064141] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.064171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.855 #44 NEW cov: 12478 ft: 15586 corp: 24/2106b lim: 120 exec/s: 44 rss: 73Mb L: 33/100 MS: 1 CrossOver- 00:08:44.855 [2024-12-05 12:50:48.115132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744071746617343 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.115164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:48.115258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.115278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:48.115386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.115405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:48.115528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.115551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.855 #45 NEW cov: 12478 ft: 15602 corp: 25/2205b lim: 120 exec/s: 45 rss: 73Mb L: 99/100 MS: 1 ShuffleBytes- 00:08:44.855 [2024-12-05 12:50:48.155071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.155101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:48.155181] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.155206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:48.155319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.155342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.855 [2024-12-05 12:50:48.155460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:44.855 [2024-12-05 12:50:48.155485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.115 #46 NEW cov: 12478 ft: 15604 corp: 26/2309b lim: 120 exec/s: 46 rss: 73Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:08:45.115 [2024-12-05 12:50:48.225377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.115 [2024-12-05 12:50:48.225406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.115 [2024-12-05 12:50:48.225483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.115 [2024-12-05 12:50:48.225504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.115 [2024-12-05 12:50:48.225627] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.115 [2024-12-05 12:50:48.225649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.115 [2024-12-05 12:50:48.225772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.115 [2024-12-05 12:50:48.225795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.115 #47 NEW cov: 12478 ft: 15662 corp: 27/2408b lim: 120 exec/s: 47 rss: 73Mb L: 99/104 MS: 1 CMP- DE: "H\000\000\000\000\000\000\000"- 00:08:45.115 [2024-12-05 12:50:48.275525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 
nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.115 [2024-12-05 12:50:48.275556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.115 [2024-12-05 12:50:48.275651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:140737488355328 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.115 [2024-12-05 12:50:48.275671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.115 [2024-12-05 12:50:48.275796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.115 [2024-12-05 12:50:48.275818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.115 [2024-12-05 12:50:48.275947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.115 [2024-12-05 12:50:48.275974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.115 #48 NEW cov: 12478 ft: 15690 corp: 28/2507b lim: 120 exec/s: 48 rss: 73Mb L: 99/104 MS: 1 ChangeBit- 00:08:45.115 [2024-12-05 12:50:48.325702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.115 [2024-12-05 12:50:48.325732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.115 [2024-12-05 12:50:48.325840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.115 [2024-12-05 12:50:48.325863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.116 [2024-12-05 12:50:48.325986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.116 [2024-12-05 12:50:48.326011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.116 [2024-12-05 12:50:48.326138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.116 [2024-12-05 12:50:48.326161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.116 #49 NEW cov: 12478 ft: 15712 corp: 29/2607b lim: 120 exec/s: 49 rss: 73Mb L: 100/104 MS: 1 InsertByte- 00:08:45.116 [2024-12-05 12:50:48.366032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744071746617343 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.116 [2024-12-05 12:50:48.366063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.116 [2024-12-05 12:50:48.366141] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.116 [2024-12-05 
12:50:48.366163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.116 [2024-12-05 12:50:48.366282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.116 [2024-12-05 12:50:48.366303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.116 [2024-12-05 12:50:48.366426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.116 [2024-12-05 12:50:48.366448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.116 [2024-12-05 12:50:48.366572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.116 [2024-12-05 12:50:48.366593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.116 #50 NEW cov: 12478 ft: 15759 corp: 30/2727b lim: 120 exec/s: 50 rss: 73Mb L: 120/120 MS: 1 InsertRepeatedBytes- 00:08:45.376 [2024-12-05 12:50:48.435767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744071746617343 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.376 [2024-12-05 12:50:48.435799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.376 [2024-12-05 12:50:48.435914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.376 [2024-12-05 12:50:48.435941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.377 [2024-12-05 12:50:48.436066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.377 [2024-12-05 12:50:48.436089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.377 #51 NEW cov: 12478 ft: 15767 corp: 31/2809b lim: 120 exec/s: 51 rss: 73Mb L: 82/120 MS: 1 CrossOver- 00:08:45.377 [2024-12-05 12:50:48.485329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.377 [2024-12-05 12:50:48.485361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.377 #52 NEW cov: 12478 ft: 15769 corp: 32/2842b lim: 120 exec/s: 52 rss: 73Mb L: 33/120 MS: 1 ShuffleBytes- 00:08:45.377 [2024-12-05 12:50:48.545827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65287 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:45.377 [2024-12-05 12:50:48.545861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.377 [2024-12-05 12:50:48.545968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:45.377 [2024-12-05 12:50:48.545991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.377 #53 NEW cov: 12478 ft: 15785 corp: 33/2900b lim: 120 exec/s: 26 rss: 73Mb L: 58/120 MS: 1 EraseBytes- 00:08:45.377 #53 DONE cov: 12478 ft: 15785 corp: 33/2900b lim: 120 exec/s: 26 rss: 73Mb 00:08:45.377 ###### Recommended dictionary. ###### 00:08:45.377 "H\000\000\000\000\000\000\000" # Uses: 0 00:08:45.377 ###### End of recommended dictionary. ###### 00:08:45.377 Done 53 runs in 2 second(s) 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:45.377 12:50:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:08:45.638 [2024-12-05 12:50:48.714500] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
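For reading the stream above: the `#N` lines interleaved with the nvme_qpair notices are standard libFuzzer status output, and each READ/WRITE NOTICE pair is SPDK echoing one fuzzed command together with the INVALID NAMESPACE OR FORMAT completion it received. As a reading aid, here is the final status line of the run that just completed, glossed field by field; the cov/ft/corp/lim/exec/rss meanings follow the libFuzzer documentation, while the L: and MS: readings are the conventional interpretation rather than anything SPDK-specific:

    #53 NEW cov: 12478 ft: 15785 corp: 33/2900b lim: 120 exec/s: 26 rss: 73Mb L: 58/120 MS: 1 EraseBytes-

    #53          the event occurred on the 53rd execution
    NEW          the input reached new coverage and was added to the corpus
    cov: 12478   total coverage points (edges) observed so far
    ft: 15785    coverage "features", a finer-grained signal than cov
    corp:        33 corpus entries totalling 2900 bytes
    lim: 120     current cap on the length of newly generated inputs
    exec/s: 26   executions per second (53 runs over roughly 2 s)
    rss: 73Mb    resident memory of the fuzzer process
    L: 58/120    this input's length / largest entry in the corpus
    MS: 1 EraseBytes-  mutation sequence that produced it (one EraseBytes step)

The `###### Recommended dictionary. ######` block printed at the end of that run lists byte sequences libFuzzer found worth persisting for future runs (the "H\000\000\000\000\000\000\000" entry came from a CMP mutation), each with a use count.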
00:08:45.638 [2024-12-05 12:50:48.714565] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid156537 ] 00:08:45.638 [2024-12-05 12:50:48.912920] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.638 [2024-12-05 12:50:48.926335] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.899 [2024-12-05 12:50:48.979123] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:45.899 [2024-12-05 12:50:48.995452] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:45.899 INFO: Running with entropic power schedule (0xFF, 100). 00:08:45.899 INFO: Seed: 1814815227 00:08:45.899 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:45.899 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:45.899 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:45.899 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.899 #2 INITED exec/s: 0 rss: 64Mb 00:08:45.899 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:45.899 This may also happen if the target rejected all inputs we tried so far 00:08:45.899 [2024-12-05 12:50:49.071360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:45.899 [2024-12-05 12:50:49.071399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.170 NEW_FUNC[1/714]: 0x46ff58 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:46.170 NEW_FUNC[2/714]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:46.170 #5 NEW cov: 12169 ft: 12168 corp: 2/34b lim: 100 exec/s: 0 rss: 72Mb L: 33/33 MS: 3 InsertByte-CrossOver-InsertRepeatedBytes- 00:08:46.170 [2024-12-05 12:50:49.412332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.170 [2024-12-05 12:50:49.412380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.170 NEW_FUNC[1/2]: 0x1fbadd8 in thread_update_stats /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:943 00:08:46.170 NEW_FUNC[2/2]: 0x1fbccd8 in spdk_thread_get_last_tsc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1339 00:08:46.170 #11 NEW cov: 12307 ft: 12847 corp: 3/72b lim: 100 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:46.170 [2024-12-05 12:50:49.452399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.170 [2024-12-05 12:50:49.452430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.170 [2024-12-05 12:50:49.452543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.170 [2024-12-05 12:50:49.452562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.434 #12 NEW cov: 12313 ft: 13298 corp: 4/127b lim: 100 exec/s: 0 rss: 72Mb L: 55/55 MS: 1 InsertRepeatedBytes- 00:08:46.434 [2024-12-05 12:50:49.522472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.434 [2024-12-05 12:50:49.522506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.434 #13 NEW cov: 12398 ft: 13551 corp: 5/165b lim: 100 exec/s: 0 rss: 72Mb L: 38/55 MS: 1 CrossOver- 00:08:46.434 [2024-12-05 12:50:49.582881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.434 [2024-12-05 12:50:49.582914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.434 [2024-12-05 12:50:49.583027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.434 [2024-12-05 12:50:49.583047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.434 [2024-12-05 12:50:49.583158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:46.434 [2024-12-05 12:50:49.583181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.434 #16 NEW cov: 12398 ft: 13941 corp: 6/231b lim: 100 exec/s: 0 rss: 72Mb L: 66/66 MS: 3 CrossOver-CopyPart-InsertRepeatedBytes- 00:08:46.434 [2024-12-05 12:50:49.622964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.434 [2024-12-05 12:50:49.622994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.434 [2024-12-05 12:50:49.623111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.434 [2024-12-05 12:50:49.623130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.434 [2024-12-05 12:50:49.623244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:46.434 [2024-12-05 12:50:49.623267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.434 #17 NEW cov: 12398 ft: 14062 corp: 7/297b lim: 100 exec/s: 0 rss: 72Mb L: 66/66 MS: 1 ChangeBinInt- 00:08:46.434 [2024-12-05 12:50:49.682865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.434 [2024-12-05 12:50:49.682891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.434 #18 NEW cov: 12398 ft: 14105 corp: 8/335b lim: 100 exec/s: 0 rss: 72Mb L: 38/66 MS: 1 ChangeBinInt- 00:08:46.434 [2024-12-05 12:50:49.743027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.434 [2024-12-05 12:50:49.743058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.700 #19 NEW cov: 12398 ft: 14188 corp: 9/373b lim: 100 exec/s: 0 rss: 72Mb L: 38/66 MS: 1 
ChangeBinInt- 00:08:46.700 [2024-12-05 12:50:49.783245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.700 [2024-12-05 12:50:49.783275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.700 [2024-12-05 12:50:49.783392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.700 [2024-12-05 12:50:49.783413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.700 #20 NEW cov: 12398 ft: 14298 corp: 10/428b lim: 100 exec/s: 0 rss: 73Mb L: 55/66 MS: 1 ChangeBinInt- 00:08:46.700 [2024-12-05 12:50:49.843576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.700 [2024-12-05 12:50:49.843604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.700 [2024-12-05 12:50:49.843667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.700 [2024-12-05 12:50:49.843686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.700 [2024-12-05 12:50:49.843800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:46.700 [2024-12-05 12:50:49.843823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.700 #21 NEW cov: 12398 ft: 14388 corp: 11/489b lim: 100 exec/s: 0 rss: 73Mb L: 61/66 MS: 1 EraseBytes- 00:08:46.700 [2024-12-05 12:50:49.893489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.700 [2024-12-05 12:50:49.893513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.700 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:46.700 #22 NEW cov: 12421 ft: 14420 corp: 12/527b lim: 100 exec/s: 0 rss: 73Mb L: 38/66 MS: 1 ChangeBinInt- 00:08:46.700 [2024-12-05 12:50:49.964068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.700 [2024-12-05 12:50:49.964097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.700 [2024-12-05 12:50:49.964215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.700 [2024-12-05 12:50:49.964236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.700 [2024-12-05 12:50:49.964358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:46.700 [2024-12-05 12:50:49.964377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.700 #23 NEW cov: 12421 ft: 14461 corp: 13/593b lim: 100 exec/s: 0 rss: 73Mb L: 66/66 MS: 1 ChangeByte- 00:08:46.971 [2024-12-05 12:50:50.014253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 
nsid:0 00:08:46.971 [2024-12-05 12:50:50.014285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.971 [2024-12-05 12:50:50.014396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.971 [2024-12-05 12:50:50.014418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.971 [2024-12-05 12:50:50.014534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:46.971 [2024-12-05 12:50:50.014555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.971 #28 NEW cov: 12421 ft: 14540 corp: 14/660b lim: 100 exec/s: 0 rss: 73Mb L: 67/67 MS: 5 InsertByte-ChangeBit-ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:08:46.971 [2024-12-05 12:50:50.064079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.971 [2024-12-05 12:50:50.064110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.971 #29 NEW cov: 12421 ft: 14576 corp: 15/698b lim: 100 exec/s: 29 rss: 73Mb L: 38/67 MS: 1 ChangeBinInt- 00:08:46.971 [2024-12-05 12:50:50.104570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.971 [2024-12-05 12:50:50.104599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.971 [2024-12-05 12:50:50.104717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.971 [2024-12-05 12:50:50.104738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.971 [2024-12-05 12:50:50.104854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:46.971 [2024-12-05 12:50:50.104875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.971 #30 NEW cov: 12421 ft: 14587 corp: 16/765b lim: 100 exec/s: 30 rss: 73Mb L: 67/67 MS: 1 InsertByte- 00:08:46.971 [2024-12-05 12:50:50.164748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.971 [2024-12-05 12:50:50.164776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.971 [2024-12-05 12:50:50.164858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.971 [2024-12-05 12:50:50.164876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.971 [2024-12-05 12:50:50.164990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:46.971 [2024-12-05 12:50:50.165010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.971 #36 NEW cov: 12421 ft: 14660 corp: 17/839b lim: 100 exec/s: 36 rss: 73Mb L: 74/74 MS: 1 CMP- DE: "\007\000\000\000\000\000\000\000"- 
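The CMP- mutation tagged DE: just above means libFuzzer synthesized a dictionary entry ("\007\000\000\000\000\000\000\000") from comparison instrumentation; entries that paid off are listed again in the "Recommended dictionary" block at the end of each round. Carrying them into a later run via a -dict= file is a plausible follow-up, though not something this job does, and it assumes the SPDK harness forwards unrecognized flags through to libFuzzer; note also that libFuzzer dictionary files conventionally use \xNN hex escapes rather than the octal form printed in the log:

    # hypothetical follow-up, not part of this pipeline
    cat > nvmf.dict <<'EOF'
    # "\007\000\000\000\000\000\000\000" from the round above, re-escaped in hex
    "\x07\x00\x00\x00\x00\x00\x00\x00"
    EOF
    ./llvm_nvme_fuzz -dict=nvmf.dict ...   # remaining flags as in the launch sketch earlier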
00:08:46.971 [2024-12-05 12:50:50.204507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.971 [2024-12-05 12:50:50.204535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.971 #37 NEW cov: 12421 ft: 14683 corp: 18/878b lim: 100 exec/s: 37 rss: 73Mb L: 39/74 MS: 1 InsertByte- 00:08:46.971 [2024-12-05 12:50:50.274850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:46.971 [2024-12-05 12:50:50.274881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.971 [2024-12-05 12:50:50.274994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:46.971 [2024-12-05 12:50:50.275017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.241 #38 NEW cov: 12421 ft: 14705 corp: 19/933b lim: 100 exec/s: 38 rss: 73Mb L: 55/74 MS: 1 CopyPart- 00:08:47.241 [2024-12-05 12:50:50.325207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.241 [2024-12-05 12:50:50.325235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.241 [2024-12-05 12:50:50.325320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.241 [2024-12-05 12:50:50.325339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.241 [2024-12-05 12:50:50.325454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.241 [2024-12-05 12:50:50.325478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.241 #39 NEW cov: 12421 ft: 14706 corp: 20/993b lim: 100 exec/s: 39 rss: 73Mb L: 60/74 MS: 1 EraseBytes- 00:08:47.241 [2024-12-05 12:50:50.395375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.241 [2024-12-05 12:50:50.395407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.241 [2024-12-05 12:50:50.395497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.241 [2024-12-05 12:50:50.395523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.241 [2024-12-05 12:50:50.395645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.241 [2024-12-05 12:50:50.395667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.241 #43 NEW cov: 12421 ft: 14722 corp: 21/1056b lim: 100 exec/s: 43 rss: 73Mb L: 63/74 MS: 4 CrossOver-ShuffleBytes-CMP-InsertRepeatedBytes- DE: "\024\325V\341"- 00:08:47.241 [2024-12-05 12:50:50.455227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.241 [2024-12-05 12:50:50.455256] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.241 #44 NEW cov: 12421 ft: 14727 corp: 22/1089b lim: 100 exec/s: 44 rss: 73Mb L: 33/74 MS: 1 CMP- DE: "\377\001\000\000"- 00:08:47.241 [2024-12-05 12:50:50.505439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.241 [2024-12-05 12:50:50.505468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.241 #45 NEW cov: 12421 ft: 14731 corp: 23/1128b lim: 100 exec/s: 45 rss: 73Mb L: 39/74 MS: 1 ChangeBit- 00:08:47.507 [2024-12-05 12:50:50.565611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.507 [2024-12-05 12:50:50.565635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.507 #46 NEW cov: 12421 ft: 14746 corp: 24/1167b lim: 100 exec/s: 46 rss: 73Mb L: 39/74 MS: 1 ShuffleBytes- 00:08:47.507 [2024-12-05 12:50:50.635808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.507 [2024-12-05 12:50:50.635839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.507 #47 NEW cov: 12421 ft: 14775 corp: 25/1206b lim: 100 exec/s: 47 rss: 73Mb L: 39/74 MS: 1 EraseBytes- 00:08:47.507 [2024-12-05 12:50:50.695933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.507 [2024-12-05 12:50:50.695961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.507 #48 NEW cov: 12421 ft: 14787 corp: 26/1244b lim: 100 exec/s: 48 rss: 73Mb L: 38/74 MS: 1 ChangeByte- 00:08:47.507 [2024-12-05 12:50:50.735999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.507 [2024-12-05 12:50:50.736029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.507 #49 NEW cov: 12421 ft: 14796 corp: 27/1283b lim: 100 exec/s: 49 rss: 73Mb L: 39/74 MS: 1 InsertByte- 00:08:47.507 [2024-12-05 12:50:50.776476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.507 [2024-12-05 12:50:50.776506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.507 [2024-12-05 12:50:50.776609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.507 [2024-12-05 12:50:50.776633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.507 [2024-12-05 12:50:50.776740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.507 [2024-12-05 12:50:50.776763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.507 #50 NEW cov: 12421 ft: 14861 corp: 28/1353b lim: 100 exec/s: 50 rss: 73Mb L: 70/74 MS: 1 PersAutoDict- DE: "\377\001\000\000"- 00:08:47.782 [2024-12-05 12:50:50.826467] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.782 [2024-12-05 12:50:50.826496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.782 [2024-12-05 12:50:50.826570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.782 [2024-12-05 12:50:50.826594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.782 [2024-12-05 12:50:50.826713] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.782 [2024-12-05 12:50:50.826734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.782 #51 NEW cov: 12421 ft: 14877 corp: 29/1416b lim: 100 exec/s: 51 rss: 74Mb L: 63/74 MS: 1 CrossOver- 00:08:47.782 [2024-12-05 12:50:50.886721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.782 [2024-12-05 12:50:50.886750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.782 [2024-12-05 12:50:50.886859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.782 [2024-12-05 12:50:50.886880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.782 [2024-12-05 12:50:50.886994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.782 [2024-12-05 12:50:50.887014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.782 #52 NEW cov: 12421 ft: 14893 corp: 30/1483b lim: 100 exec/s: 52 rss: 74Mb L: 67/74 MS: 1 ChangeBit- 00:08:47.782 [2024-12-05 12:50:50.926871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.782 [2024-12-05 12:50:50.926901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.782 [2024-12-05 12:50:50.926978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:47.782 [2024-12-05 12:50:50.926998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.782 [2024-12-05 12:50:50.927112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.782 [2024-12-05 12:50:50.927131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.782 #53 NEW cov: 12421 ft: 14932 corp: 31/1546b lim: 100 exec/s: 53 rss: 74Mb L: 63/74 MS: 1 ChangeByte- 00:08:47.782 [2024-12-05 12:50:50.986970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.782 [2024-12-05 12:50:50.986997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.783 [2024-12-05 12:50:50.987090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 
nsid:0 00:08:47.783 [2024-12-05 12:50:50.987111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.783 [2024-12-05 12:50:50.987223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:47.783 [2024-12-05 12:50:50.987244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.783 #54 NEW cov: 12421 ft: 14946 corp: 32/1618b lim: 100 exec/s: 54 rss: 74Mb L: 72/74 MS: 1 CrossOver- 00:08:47.783 [2024-12-05 12:50:51.046848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:47.783 [2024-12-05 12:50:51.046873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.783 #55 NEW cov: 12421 ft: 14954 corp: 33/1657b lim: 100 exec/s: 27 rss: 74Mb L: 39/74 MS: 1 InsertByte- 00:08:47.783 #55 DONE cov: 12421 ft: 14954 corp: 33/1657b lim: 100 exec/s: 27 rss: 74Mb 00:08:47.783 ###### Recommended dictionary. ###### 00:08:47.783 "\007\000\000\000\000\000\000\000" # Uses: 0 00:08:47.783 "\024\325V\341" # Uses: 0 00:08:47.783 "\377\001\000\000" # Uses: 1 00:08:47.783 ###### End of recommended dictionary. ###### 00:08:47.783 Done 55 runs in 2 second(s) 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:48.058 12:50:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:48.058 12:50:51 
llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:08:48.058 [2024-12-05 12:50:51.213148] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:48.059 [2024-12-05 12:50:51.213215] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid157072 ] 00:08:48.348 [2024-12-05 12:50:51.411351] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:48.348 [2024-12-05 12:50:51.424862] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.348 [2024-12-05 12:50:51.477480] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:48.348 [2024-12-05 12:50:51.493783] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:48.348 INFO: Running with entropic power schedule (0xFF, 100). 00:08:48.348 INFO: Seed: 17859404 00:08:48.348 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:48.348 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:48.348 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:48.348 INFO: A corpus is not provided, starting from an empty corpus 00:08:48.348 #2 INITED exec/s: 0 rss: 65Mb 00:08:48.348 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
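That warning is expected here rather than a sign of missing instrumentation: the INFO lines just above show each round deliberately starts from an empty corpus ("0 files found ... starting from an empty corpus"), so no interesting input exists yet when the message prints, and the NEW cov: lines that follow confirm coverage feedback is working. Pre-seeding the round's corpus directory before launch would quiet it; purely illustrative, with 'seeds/' a hypothetical directory not present in this job:

    mkdir -p ../corpus/llvm_nvmf_19
    cp seeds/*.bin ../corpus/llvm_nvmf_19/   # hypothetical seed inputs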
00:08:48.348 This may also happen if the target rejected all inputs we tried so far 00:08:48.348 [2024-12-05 12:50:51.538999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4557430888798830399 len:16192 00:08:48.348 [2024-12-05 12:50:51.539028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.631 NEW_FUNC[1/716]: 0x472f18 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:48.631 NEW_FUNC[2/716]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:48.631 #5 NEW cov: 12173 ft: 12167 corp: 2/14b lim: 50 exec/s: 0 rss: 72Mb L: 13/13 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:08:48.631 [2024-12-05 12:50:51.870271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281602 len:49859 00:08:48.631 [2024-12-05 12:50:51.870310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.631 [2024-12-05 12:50:51.870367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586874562 len:49859 00:08:48.631 [2024-12-05 12:50:51.870386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.631 [2024-12-05 12:50:51.870444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993530586874562 len:49859 00:08:48.632 [2024-12-05 12:50:51.870462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.632 [2024-12-05 12:50:51.870519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14033993530586874562 len:49859 00:08:48.632 [2024-12-05 12:50:51.870538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.632 #8 NEW cov: 12286 ft: 13062 corp: 3/57b lim: 50 exec/s: 0 rss: 72Mb L: 43/43 MS: 3 ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:08:48.632 [2024-12-05 12:50:51.909882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:48.632 [2024-12-05 12:50:51.909914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.632 #12 NEW cov: 12292 ft: 13428 corp: 4/73b lim: 50 exec/s: 0 rss: 72Mb L: 16/43 MS: 4 ChangeByte-CrossOver-ChangeBit-InsertRepeatedBytes- 00:08:48.914 [2024-12-05 12:50:51.950374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281602 len:49859 00:08:48.914 [2024-12-05 12:50:51.950402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.914 [2024-12-05 12:50:51.950451] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586874562 len:49859 00:08:48.914 [2024-12-05 12:50:51.950467] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.914 [2024-12-05 12:50:51.950521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993530586874562 len:49859 00:08:48.914 [2024-12-05 12:50:51.950538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.914 [2024-12-05 12:50:51.950595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14033993530586874562 len:49859 00:08:48.914 [2024-12-05 12:50:51.950610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.914 #13 NEW cov: 12377 ft: 13764 corp: 5/116b lim: 50 exec/s: 0 rss: 72Mb L: 43/43 MS: 1 ShuffleBytes- 00:08:48.914 [2024-12-05 12:50:52.010177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:48.914 [2024-12-05 12:50:52.010207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.914 #14 NEW cov: 12377 ft: 13844 corp: 6/135b lim: 50 exec/s: 0 rss: 72Mb L: 19/43 MS: 1 CopyPart- 00:08:48.914 [2024-12-05 12:50:52.070700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281602 len:49859 00:08:48.914 [2024-12-05 12:50:52.070733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.914 [2024-12-05 12:50:52.070767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586874562 len:49859 00:08:48.914 [2024-12-05 12:50:52.070782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.914 [2024-12-05 12:50:52.070839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993531056636610 len:49859 00:08:48.914 [2024-12-05 12:50:52.070855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.914 [2024-12-05 12:50:52.070912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14033993530586874562 len:49859 00:08:48.914 [2024-12-05 12:50:52.070928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.914 #15 NEW cov: 12377 ft: 13896 corp: 7/179b lim: 50 exec/s: 0 rss: 73Mb L: 44/44 MS: 1 InsertByte- 00:08:48.914 [2024-12-05 12:50:52.130868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281602 len:49859 00:08:48.914 [2024-12-05 12:50:52.130896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.914 [2024-12-05 12:50:52.130942] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586836162 len:49859 00:08:48.914 [2024-12-05 12:50:52.130958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.914 [2024-12-05 12:50:52.131012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993531056636610 len:49859 00:08:48.914 [2024-12-05 12:50:52.131027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.914 [2024-12-05 12:50:52.131085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14033993530586874562 len:49859 00:08:48.914 [2024-12-05 12:50:52.131100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.914 #16 NEW cov: 12377 ft: 14002 corp: 8/223b lim: 50 exec/s: 0 rss: 73Mb L: 44/44 MS: 1 ChangeBinInt- 00:08:48.914 [2024-12-05 12:50:52.190888] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4557430888798830399 len:16192 00:08:48.914 [2024-12-05 12:50:52.190916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.914 [2024-12-05 12:50:52.190952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:940422247733857549 len:3342 00:08:48.914 [2024-12-05 12:50:52.190968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.914 [2024-12-05 12:50:52.191024] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:940422246894996749 len:3342 00:08:48.914 [2024-12-05 12:50:52.191041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.196 #17 NEW cov: 12377 ft: 14332 corp: 9/259b lim: 50 exec/s: 0 rss: 73Mb L: 36/44 MS: 1 InsertRepeatedBytes- 00:08:49.196 [2024-12-05 12:50:52.250826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:49.196 [2024-12-05 12:50:52.250860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.196 #18 NEW cov: 12377 ft: 14401 corp: 10/275b lim: 50 exec/s: 0 rss: 73Mb L: 16/44 MS: 1 ChangeBit- 00:08:49.196 [2024-12-05 12:50:52.291287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:584655537191240386 len:7454 00:08:49.196 [2024-12-05 12:50:52.291315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.196 [2024-12-05 12:50:52.291352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2097865012304223517 len:7454 00:08:49.196 [2024-12-05 12:50:52.291367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.196 [2024-12-05 12:50:52.291420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2097865012304223517 len:7454 00:08:49.196 [2024-12-05 12:50:52.291436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:49.196 [2024-12-05 12:50:52.291490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2097865012304223517 len:7454 00:08:49.196 [2024-12-05 12:50:52.291506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.196 #21 NEW cov: 12377 ft: 14451 corp: 11/323b lim: 50 exec/s: 0 rss: 73Mb L: 48/48 MS: 3 CrossOver-ChangeBinInt-InsertRepeatedBytes- 00:08:49.196 [2024-12-05 12:50:52.331198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:49.196 [2024-12-05 12:50:52.331226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.196 [2024-12-05 12:50:52.331282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551410 len:16683 00:08:49.196 [2024-12-05 12:50:52.331298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.196 #22 NEW cov: 12377 ft: 14706 corp: 12/343b lim: 50 exec/s: 0 rss: 73Mb L: 20/48 MS: 1 InsertByte- 00:08:49.196 [2024-12-05 12:50:52.391587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281453 len:49859 00:08:49.196 [2024-12-05 12:50:52.391613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.196 [2024-12-05 12:50:52.391659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586836162 len:49859 00:08:49.196 [2024-12-05 12:50:52.391674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.196 [2024-12-05 12:50:52.391731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993531056636610 len:49859 00:08:49.196 [2024-12-05 12:50:52.391746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.196 [2024-12-05 12:50:52.391800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14033993530586874562 len:49859 00:08:49.196 [2024-12-05 12:50:52.391816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.196 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:49.196 #23 NEW cov: 12400 ft: 14727 corp: 13/387b lim: 50 exec/s: 0 rss: 73Mb L: 44/48 MS: 1 ChangeByte- 00:08:49.196 [2024-12-05 12:50:52.451742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281453 len:49859 00:08:49.196 [2024-12-05 12:50:52.451770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.196 [2024-12-05 12:50:52.451808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586836162 len:49859 00:08:49.196 [2024-12-05 12:50:52.451823] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.196 [2024-12-05 12:50:52.451878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993531056636610 len:49859 00:08:49.196 [2024-12-05 12:50:52.451894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.196 [2024-12-05 12:50:52.451950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14033779392107430594 len:45 00:08:49.196 [2024-12-05 12:50:52.451965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.196 #29 NEW cov: 12400 ft: 14804 corp: 14/431b lim: 50 exec/s: 0 rss: 73Mb L: 44/48 MS: 1 ChangeBinInt- 00:08:49.471 [2024-12-05 12:50:52.511703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:49.471 [2024-12-05 12:50:52.511730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.471 [2024-12-05 12:50:52.511764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:144680349937369602 len:65536 00:08:49.471 [2024-12-05 12:50:52.511780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.471 #30 NEW cov: 12400 ft: 14840 corp: 15/453b lim: 50 exec/s: 30 rss: 73Mb L: 22/48 MS: 1 InsertRepeatedBytes- 00:08:49.471 [2024-12-05 12:50:52.551926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281602 len:49859 00:08:49.471 [2024-12-05 12:50:52.551956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.471 [2024-12-05 12:50:52.551990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586874562 len:49859 00:08:49.471 [2024-12-05 12:50:52.552006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.471 [2024-12-05 12:50:52.552062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993530586874562 len:49859 00:08:49.471 [2024-12-05 12:50:52.552079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.471 #31 NEW cov: 12400 ft: 14887 corp: 16/484b lim: 50 exec/s: 31 rss: 73Mb L: 31/48 MS: 1 EraseBytes- 00:08:49.471 [2024-12-05 12:50:52.592172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281453 len:49859 00:08:49.471 [2024-12-05 12:50:52.592200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.471 [2024-12-05 12:50:52.592247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586836162 len:49859 00:08:49.471 [2024-12-05 12:50:52.592263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.471 [2024-12-05 12:50:52.592320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033992736487686850 len:49859 00:08:49.471 [2024-12-05 12:50:52.592337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.471 [2024-12-05 12:50:52.592389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14033993530586874562 len:49859 00:08:49.471 [2024-12-05 12:50:52.592407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.471 #32 NEW cov: 12400 ft: 14900 corp: 17/528b lim: 50 exec/s: 32 rss: 73Mb L: 44/48 MS: 1 ChangeByte- 00:08:49.471 [2024-12-05 12:50:52.631945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:49.471 [2024-12-05 12:50:52.631973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.471 #33 NEW cov: 12400 ft: 14917 corp: 18/543b lim: 50 exec/s: 33 rss: 73Mb L: 15/48 MS: 1 EraseBytes- 00:08:49.471 [2024-12-05 12:50:52.672410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281453 len:49859 00:08:49.471 [2024-12-05 12:50:52.672437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.471 [2024-12-05 12:50:52.672486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586836162 len:49887 00:08:49.471 [2024-12-05 12:50:52.672501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.471 [2024-12-05 12:50:52.672559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993530586874562 len:49859 00:08:49.471 [2024-12-05 12:50:52.672575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.471 [2024-12-05 12:50:52.672632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13979173246625563330 len:11459 00:08:49.471 [2024-12-05 12:50:52.672648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.471 #34 NEW cov: 12400 ft: 14937 corp: 19/586b lim: 50 exec/s: 34 rss: 73Mb L: 43/48 MS: 1 EraseBytes- 00:08:49.471 [2024-12-05 12:50:52.732473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281602 len:49859 00:08:49.471 [2024-12-05 12:50:52.732499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.471 [2024-12-05 12:50:52.732545] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586874562 len:49710 00:08:49.471 [2024-12-05 12:50:52.732562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:49.471 [2024-12-05 12:50:52.732617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993530586874562 len:49859 00:08:49.471 [2024-12-05 12:50:52.732633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.471 #35 NEW cov: 12400 ft: 14956 corp: 20/617b lim: 50 exec/s: 35 rss: 73Mb L: 31/48 MS: 1 ChangeByte- 00:08:49.762 [2024-12-05 12:50:52.792395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65346 00:08:49.762 [2024-12-05 12:50:52.792424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.762 #36 NEW cov: 12400 ft: 14964 corp: 21/628b lim: 50 exec/s: 36 rss: 73Mb L: 11/48 MS: 1 EraseBytes- 00:08:49.762 [2024-12-05 12:50:52.832501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:576742227280134143 len:65536 00:08:49.762 [2024-12-05 12:50:52.832530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.762 #37 NEW cov: 12400 ft: 14990 corp: 22/646b lim: 50 exec/s: 37 rss: 74Mb L: 18/48 MS: 1 CMP- DE: "\010\000"- 00:08:49.762 [2024-12-05 12:50:52.892643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:576742227280134143 len:65536 00:08:49.762 [2024-12-05 12:50:52.892674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.762 #38 NEW cov: 12400 ft: 15062 corp: 23/664b lim: 50 exec/s: 38 rss: 74Mb L: 18/48 MS: 1 CrossOver- 00:08:49.762 [2024-12-05 12:50:52.952807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:49.762 [2024-12-05 12:50:52.952837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.762 #39 NEW cov: 12400 ft: 15089 corp: 24/683b lim: 50 exec/s: 39 rss: 74Mb L: 19/48 MS: 1 CrossOver- 00:08:49.762 [2024-12-05 12:50:52.992927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4557430888798830399 len:16192 00:08:49.762 [2024-12-05 12:50:52.992956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.762 #40 NEW cov: 12400 ft: 15106 corp: 25/696b lim: 50 exec/s: 40 rss: 74Mb L: 13/48 MS: 1 CopyPart- 00:08:49.762 [2024-12-05 12:50:53.033062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4557430888798830399 len:16192 00:08:49.762 [2024-12-05 12:50:53.033090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.762 #41 NEW cov: 12400 ft: 15155 corp: 26/709b lim: 50 exec/s: 41 rss: 74Mb L: 13/48 MS: 1 ShuffleBytes- 00:08:50.046 [2024-12-05 12:50:53.073219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:576742227280134143 len:65536 00:08:50.046 [2024-12-05 12:50:53.073247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.046 #42 NEW cov: 12400 ft: 15172 corp: 27/727b lim: 50 exec/s: 42 rss: 74Mb L: 18/48 MS: 1 PersAutoDict- DE: "\010\000"- 00:08:50.046 [2024-12-05 12:50:53.113303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073608888319 len:65536 00:08:50.046 [2024-12-05 12:50:53.113331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.046 #43 NEW cov: 12400 ft: 15177 corp: 28/742b lim: 50 exec/s: 43 rss: 74Mb L: 15/48 MS: 1 ChangeBinInt- 00:08:50.046 [2024-12-05 12:50:53.173906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281453 len:49859 00:08:50.046 [2024-12-05 12:50:53.173934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.046 [2024-12-05 12:50:53.173985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586836162 len:49859 00:08:50.046 [2024-12-05 12:50:53.174001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.046 [2024-12-05 12:50:53.174055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033992736487686850 len:49859 00:08:50.046 [2024-12-05 12:50:53.174071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.046 [2024-12-05 12:50:53.174126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14033993530586874562 len:49859 00:08:50.046 [2024-12-05 12:50:53.174143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.046 [2024-12-05 12:50:53.174198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:15119096122905383377 len:49859 00:08:50.046 [2024-12-05 12:50:53.174214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:50.046 #44 NEW cov: 12400 ft: 15212 corp: 29/792b lim: 50 exec/s: 44 rss: 74Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:50.046 [2024-12-05 12:50:53.233637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446534251672240127 len:16192 00:08:50.046 [2024-12-05 12:50:53.233664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.046 #46 NEW cov: 12400 ft: 15221 corp: 30/808b lim: 50 exec/s: 46 rss: 74Mb L: 16/50 MS: 2 EraseBytes-CrossOver- 00:08:50.046 [2024-12-05 12:50:53.293920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:50.046 [2024-12-05 12:50:53.293946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.046 [2024-12-05 12:50:53.293982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3674937292394332159 len:65346 00:08:50.046 [2024-12-05 
12:50:53.293998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.046 #47 NEW cov: 12400 ft: 15223 corp: 31/829b lim: 50 exec/s: 47 rss: 74Mb L: 21/50 MS: 1 InsertByte- 00:08:50.046 [2024-12-05 12:50:53.354216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281602 len:49859 00:08:50.046 [2024-12-05 12:50:53.354246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.046 [2024-12-05 12:50:53.354281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446743810689073407 len:49710 00:08:50.046 [2024-12-05 12:50:53.354296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.046 [2024-12-05 12:50:53.354351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993530586874562 len:49859 00:08:50.046 [2024-12-05 12:50:53.354367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.336 #48 NEW cov: 12400 ft: 15248 corp: 32/860b lim: 50 exec/s: 48 rss: 74Mb L: 31/50 MS: 1 CrossOver- 00:08:50.336 [2024-12-05 12:50:53.414115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:50.336 [2024-12-05 12:50:53.414144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.336 #49 NEW cov: 12400 ft: 15291 corp: 33/879b lim: 50 exec/s: 49 rss: 74Mb L: 19/50 MS: 1 CopyPart- 00:08:50.336 [2024-12-05 12:50:53.474628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14033993527491281602 len:49859 00:08:50.336 [2024-12-05 12:50:53.474656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.336 [2024-12-05 12:50:53.474698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586836162 len:49859 00:08:50.336 [2024-12-05 12:50:53.474714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.336 [2024-12-05 12:50:53.474769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993531056636610 len:49859 00:08:50.336 [2024-12-05 12:50:53.474785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.336 [2024-12-05 12:50:53.474845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14033993530586874562 len:49859 00:08:50.336 [2024-12-05 12:50:53.474861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.336 #55 NEW cov: 12400 ft: 15319 corp: 34/923b lim: 50 exec/s: 55 rss: 74Mb L: 44/50 MS: 1 ChangeBit- 00:08:50.336 [2024-12-05 12:50:53.514735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:3297412015840019014 len:49859 00:08:50.336 [2024-12-05 12:50:53.514762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.336 [2024-12-05 12:50:53.514809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14033993530586874412 len:49859 00:08:50.336 [2024-12-05 12:50:53.514825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.336 [2024-12-05 12:50:53.514885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14033993531056636610 len:49859 00:08:50.336 [2024-12-05 12:50:53.514902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.336 [2024-12-05 12:50:53.514959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14033779392107430594 len:45 00:08:50.336 [2024-12-05 12:50:53.514976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.336 #56 NEW cov: 12400 ft: 15323 corp: 35/967b lim: 50 exec/s: 28 rss: 74Mb L: 44/50 MS: 1 InsertByte- 00:08:50.336 #56 DONE cov: 12400 ft: 15323 corp: 35/967b lim: 50 exec/s: 28 rss: 74Mb 00:08:50.336 ###### Recommended dictionary. ###### 00:08:50.336 "\010\000" # Uses: 1 00:08:50.336 ###### End of recommended dictionary. ###### 00:08:50.336 Done 56 runs in 2 second(s) 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:50.621 
12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:50.621 12:50:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:50.621 [2024-12-05 12:50:53.700469] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:08:50.621 [2024-12-05 12:50:53.700536] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid157528 ] 00:08:50.621 [2024-12-05 12:50:53.897429] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.621 [2024-12-05 12:50:53.910718] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.892 [2024-12-05 12:50:53.963502] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:50.892 [2024-12-05 12:50:53.979853] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:50.892 INFO: Running with entropic power schedule (0xFF, 100). 00:08:50.892 INFO: Seed: 2502856709 00:08:50.892 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:50.892 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:50.892 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:50.892 INFO: A corpus is not provided, starting from an empty corpus 00:08:50.892 #2 INITED exec/s: 0 rss: 64Mb 00:08:50.892 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:50.892 This may also happen if the target rejected all inputs we tried so far 00:08:50.893 [2024-12-05 12:50:54.035039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:50.893 [2024-12-05 12:50:54.035068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.186 NEW_FUNC[1/718]: 0x474ad8 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:51.186 NEW_FUNC[2/718]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:51.186 #11 NEW cov: 12231 ft: 12201 corp: 2/24b lim: 90 exec/s: 0 rss: 72Mb L: 23/23 MS: 4 CrossOver-CMP-InsertRepeatedBytes-CMP- DE: "\010\000\000\000"-"\377\377\377\377\377\377\377)"- 00:08:51.186 [2024-12-05 12:50:54.367170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.186 [2024-12-05 12:50:54.367221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.186 [2024-12-05 12:50:54.367349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:51.186 [2024-12-05 12:50:54.367381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.186 [2024-12-05 12:50:54.367517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:51.186 [2024-12-05 12:50:54.367541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.186 #15 NEW cov: 12344 ft: 13690 corp: 3/89b lim: 90 exec/s: 0 rss: 72Mb L: 65/65 MS: 4 InsertByte-ChangeBinInt-EraseBytes-InsertRepeatedBytes- 00:08:51.186 [2024-12-05 12:50:54.426607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.186 [2024-12-05 12:50:54.426634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.186 #18 NEW cov: 12350 ft: 13864 corp: 4/123b lim: 90 exec/s: 0 rss: 72Mb L: 34/65 MS: 3 CrossOver-CrossOver-CrossOver- 00:08:51.455 [2024-12-05 12:50:54.476806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.455 [2024-12-05 12:50:54.476840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.455 #20 NEW cov: 12435 ft: 14110 corp: 5/148b lim: 90 exec/s: 0 rss: 72Mb L: 25/65 MS: 2 CMP-InsertRepeatedBytes- DE: "\014\000\000\000\000\000\000\000"- 00:08:51.455 [2024-12-05 12:50:54.527390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.455 [2024-12-05 12:50:54.527426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.455 [2024-12-05 12:50:54.527561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:51.455 [2024-12-05 12:50:54.527582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.455 [2024-12-05 12:50:54.527707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:51.455 [2024-12-05 12:50:54.527727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.455 #21 NEW cov: 12435 ft: 14274 corp: 6/213b lim: 90 exec/s: 0 rss: 72Mb L: 65/65 MS: 1 CrossOver- 00:08:51.455 [2024-12-05 12:50:54.597097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.455 [2024-12-05 12:50:54.597124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.455 #22 NEW cov: 12435 ft: 14357 corp: 7/237b lim: 90 exec/s: 0 rss: 72Mb L: 24/65 MS: 1 InsertByte- 00:08:51.455 [2024-12-05 12:50:54.667382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.455 [2024-12-05 12:50:54.667407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.455 #23 NEW cov: 12435 ft: 14439 corp: 8/271b lim: 90 exec/s: 0 rss: 72Mb L: 34/65 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:08:51.455 [2024-12-05 12:50:54.727529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.455 [2024-12-05 12:50:54.727560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.732 #24 NEW cov: 12435 ft: 14483 corp: 9/305b lim: 90 exec/s: 0 rss: 72Mb L: 34/65 MS: 1 ChangeBinInt- 00:08:51.732 [2024-12-05 12:50:54.797718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.732 [2024-12-05 12:50:54.797751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.732 #25 NEW cov: 12435 ft: 14566 corp: 10/330b lim: 90 exec/s: 0 rss: 72Mb L: 25/65 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:08:51.732 [2024-12-05 12:50:54.867852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.732 [2024-12-05 12:50:54.867888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.732 #31 NEW cov: 12435 ft: 14609 corp: 11/356b lim: 90 exec/s: 0 rss: 73Mb L: 26/65 MS: 1 EraseBytes- 00:08:51.732 [2024-12-05 12:50:54.938645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.732 [2024-12-05 12:50:54.938681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.732 [2024-12-05 12:50:54.938785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:51.732 [2024-12-05 12:50:54.938807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.732 [2024-12-05 12:50:54.938942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:51.732 
[2024-12-05 12:50:54.938966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.732 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:51.732 #32 NEW cov: 12458 ft: 14658 corp: 12/421b lim: 90 exec/s: 0 rss: 73Mb L: 65/65 MS: 1 ShuffleBytes- 00:08:51.732 [2024-12-05 12:50:55.008818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:51.733 [2024-12-05 12:50:55.008856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.733 [2024-12-05 12:50:55.008965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:51.733 [2024-12-05 12:50:55.008986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.733 [2024-12-05 12:50:55.009122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:51.733 [2024-12-05 12:50:55.009142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.733 #33 NEW cov: 12458 ft: 14680 corp: 13/486b lim: 90 exec/s: 33 rss: 73Mb L: 65/65 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:08:52.017 [2024-12-05 12:50:55.058621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.017 [2024-12-05 12:50:55.058647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.017 #34 NEW cov: 12458 ft: 14702 corp: 14/520b lim: 90 exec/s: 34 rss: 73Mb L: 34/65 MS: 1 ChangeBinInt- 00:08:52.017 [2024-12-05 12:50:55.109000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.017 [2024-12-05 12:50:55.109029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.017 [2024-12-05 12:50:55.109166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.017 [2024-12-05 12:50:55.109187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.017 #35 NEW cov: 12458 ft: 14999 corp: 15/572b lim: 90 exec/s: 35 rss: 73Mb L: 52/65 MS: 1 EraseBytes- 00:08:52.017 [2024-12-05 12:50:55.179695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.017 [2024-12-05 12:50:55.179727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.017 [2024-12-05 12:50:55.179820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.017 [2024-12-05 12:50:55.179846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.017 [2024-12-05 12:50:55.179970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:52.017 [2024-12-05 12:50:55.179992] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.017 [2024-12-05 12:50:55.180105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:52.017 [2024-12-05 12:50:55.180129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.017 #36 NEW cov: 12458 ft: 15373 corp: 16/657b lim: 90 exec/s: 36 rss: 73Mb L: 85/85 MS: 1 InsertRepeatedBytes- 00:08:52.017 [2024-12-05 12:50:55.229186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.017 [2024-12-05 12:50:55.229213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.017 #37 NEW cov: 12458 ft: 15457 corp: 17/680b lim: 90 exec/s: 37 rss: 73Mb L: 23/85 MS: 1 ChangeBinInt- 00:08:52.017 [2024-12-05 12:50:55.279199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.017 [2024-12-05 12:50:55.279231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.017 #38 NEW cov: 12458 ft: 15495 corp: 18/714b lim: 90 exec/s: 38 rss: 73Mb L: 34/85 MS: 1 CrossOver- 00:08:52.300 [2024-12-05 12:50:55.329845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.300 [2024-12-05 12:50:55.329880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.300 [2024-12-05 12:50:55.329982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.300 [2024-12-05 12:50:55.330005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.300 [2024-12-05 12:50:55.330123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:52.300 [2024-12-05 12:50:55.330142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.300 #39 NEW cov: 12458 ft: 15517 corp: 19/768b lim: 90 exec/s: 39 rss: 73Mb L: 54/85 MS: 1 EraseBytes- 00:08:52.300 [2024-12-05 12:50:55.379562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.300 [2024-12-05 12:50:55.379594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.300 #40 NEW cov: 12458 ft: 15528 corp: 20/797b lim: 90 exec/s: 40 rss: 73Mb L: 29/85 MS: 1 CrossOver- 00:08:52.300 [2024-12-05 12:50:55.450090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.300 [2024-12-05 12:50:55.450122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.300 [2024-12-05 12:50:55.450240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.300 [2024-12-05 12:50:55.450261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:52.300 #41 NEW cov: 12458 ft: 15574 corp: 21/850b lim: 90 exec/s: 41 rss: 73Mb L: 53/85 MS: 1 InsertByte- 00:08:52.300 [2024-12-05 12:50:55.520230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.300 [2024-12-05 12:50:55.520264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.300 [2024-12-05 12:50:55.520382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.300 [2024-12-05 12:50:55.520404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.300 #42 NEW cov: 12458 ft: 15654 corp: 22/891b lim: 90 exec/s: 42 rss: 73Mb L: 41/85 MS: 1 CopyPart- 00:08:52.300 [2024-12-05 12:50:55.570079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.300 [2024-12-05 12:50:55.570112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.610 #48 NEW cov: 12458 ft: 15722 corp: 23/920b lim: 90 exec/s: 48 rss: 73Mb L: 29/85 MS: 1 CrossOver- 00:08:52.610 [2024-12-05 12:50:55.640369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.610 [2024-12-05 12:50:55.640400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.610 #49 NEW cov: 12458 ft: 15724 corp: 24/954b lim: 90 exec/s: 49 rss: 73Mb L: 34/85 MS: 1 ChangeBit- 00:08:52.610 [2024-12-05 12:50:55.710586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.610 [2024-12-05 12:50:55.710617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.610 #50 NEW cov: 12458 ft: 15734 corp: 25/988b lim: 90 exec/s: 50 rss: 73Mb L: 34/85 MS: 1 ChangeByte- 00:08:52.610 [2024-12-05 12:50:55.760685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.610 [2024-12-05 12:50:55.760715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.610 #51 NEW cov: 12458 ft: 15740 corp: 26/1012b lim: 90 exec/s: 51 rss: 73Mb L: 24/85 MS: 1 ChangeBinInt- 00:08:52.610 [2024-12-05 12:50:55.830786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.610 [2024-12-05 12:50:55.830812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.610 #52 NEW cov: 12458 ft: 15744 corp: 27/1044b lim: 90 exec/s: 52 rss: 74Mb L: 32/85 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:08:52.902 [2024-12-05 12:50:55.901430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.902 [2024-12-05 12:50:55.901464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.902 [2024-12-05 12:50:55.901568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 
nsid:0 00:08:52.902 [2024-12-05 12:50:55.901587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.902 [2024-12-05 12:50:55.901715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:52.902 [2024-12-05 12:50:55.901738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.902 #53 NEW cov: 12458 ft: 15778 corp: 28/1110b lim: 90 exec/s: 53 rss: 74Mb L: 66/85 MS: 1 InsertByte- 00:08:52.902 [2024-12-05 12:50:55.951240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.902 [2024-12-05 12:50:55.951266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.902 #54 NEW cov: 12458 ft: 15786 corp: 29/1134b lim: 90 exec/s: 54 rss: 74Mb L: 24/85 MS: 1 ChangeBinInt- 00:08:52.902 [2024-12-05 12:50:56.001648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:52.902 [2024-12-05 12:50:56.001681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.902 [2024-12-05 12:50:56.001805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:52.902 [2024-12-05 12:50:56.001823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.902 #55 NEW cov: 12458 ft: 15805 corp: 30/1175b lim: 90 exec/s: 27 rss: 74Mb L: 41/85 MS: 1 CrossOver- 00:08:52.902 #55 DONE cov: 12458 ft: 15805 corp: 30/1175b lim: 90 exec/s: 27 rss: 74Mb 00:08:52.902 ###### Recommended dictionary. ###### 00:08:52.902 "\010\000\000\000" # Uses: 0 00:08:52.902 "\377\377\377\377\377\377\377)" # Uses: 0 00:08:52.902 "\014\000\000\000\000\000\000\000" # Uses: 4 00:08:52.902 ###### End of recommended dictionary. 
###### 00:08:52.902 Done 55 runs in 2 second(s) 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:52.902 12:50:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:52.902 [2024-12-05 12:50:56.185067] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
00:08:52.902 [2024-12-05 12:50:56.185148] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid157919 ] 00:08:53.177 [2024-12-05 12:50:56.383740] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.177 [2024-12-05 12:50:56.395885] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.177 [2024-12-05 12:50:56.448668] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:53.177 [2024-12-05 12:50:56.464951] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:53.177 INFO: Running with entropic power schedule (0xFF, 100). 00:08:53.177 INFO: Seed: 693886037 00:08:53.459 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:53.459 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:53.459 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:53.459 INFO: A corpus is not provided, starting from an empty corpus 00:08:53.459 #2 INITED exec/s: 0 rss: 64Mb 00:08:53.459 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:53.459 This may also happen if the target rejected all inputs we tried so far 00:08:53.459 [2024-12-05 12:50:56.530525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:53.459 [2024-12-05 12:50:56.530557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.459 [2024-12-05 12:50:56.530610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:53.459 [2024-12-05 12:50:56.530626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.459 [2024-12-05 12:50:56.530682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:53.459 [2024-12-05 12:50:56.530699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.750 NEW_FUNC[1/718]: 0x477d08 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:53.750 NEW_FUNC[2/718]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:53.750 #6 NEW cov: 12185 ft: 12187 corp: 2/38b lim: 50 exec/s: 0 rss: 72Mb L: 37/37 MS: 4 ChangeBit-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:53.750 [2024-12-05 12:50:56.862024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:53.750 [2024-12-05 12:50:56.862112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.750 [2024-12-05 12:50:56.862230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:53.750 [2024-12-05 12:50:56.862277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
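The shell trace above shows how nvmf/run.sh launches each numbered fuzzer: it derives the TCP port from the fuzzer number, rewrites trsvcid in fuzz_json.conf, writes LSAN leak suppressions, and then executes the llvm_nvme_fuzz binary for one timed run. A minimal sketch of reproducing a single run outside Jenkins follows, assuming an instrumented SPDK build at $SPDK_DIR; the variable names and local paths are illustrative, while the flags themselves are the ones visible in the trace.

#!/usr/bin/env bash
# Re-run nvmf fuzzer type 21 for 1 second, mirroring the traced run.sh steps.
SPDK_DIR=${SPDK_DIR:-$HOME/spdk}      # assumed: SPDK checkout built with LLVM fuzzing support
N=21                                  # fuzzer type (-Z); also selects the TCP port
PORT="44$(printf %02d "$N")"          # run.sh maps fuzzer 20 -> 4420, 21 -> 4421, ...
CORPUS="$SPDK_DIR/../corpus/llvm_nvmf_$N"

mkdir -p "$CORPUS"
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$PORT\"/" \
    "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$N.conf"

# LSAN suppressions for known-intentional leaks, as echoed by run.sh
printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > /var/tmp/suppress_nvmf_fuzz
export LSAN_OPTIONS="report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0"

"$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -P "$SPDK_DIR/../output/llvm/" \
    -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$PORT" \
    -c "/tmp/fuzz_json_$N.conf" -t 1 -D "$CORPUS" -Z "$N"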
00:08:53.750 [2024-12-05 12:50:56.862389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:53.750 [2024-12-05 12:50:56.862432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.750 [2024-12-05 12:50:56.862542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:53.750 [2024-12-05 12:50:56.862587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.750 #10 NEW cov: 12318 ft: 13196 corp: 3/80b lim: 50 exec/s: 0 rss: 72Mb L: 42/42 MS: 4 ChangeBinInt-InsertByte-CrossOver-InsertRepeatedBytes- 00:08:53.750 [2024-12-05 12:50:56.911450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:53.750 [2024-12-05 12:50:56.911480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.750 [2024-12-05 12:50:56.911519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:53.750 [2024-12-05 12:50:56.911535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.750 [2024-12-05 12:50:56.911589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:53.750 [2024-12-05 12:50:56.911606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.750 #11 NEW cov: 12324 ft: 13352 corp: 4/117b lim: 50 exec/s: 0 rss: 72Mb L: 37/42 MS: 1 ChangeBit- 00:08:53.750 [2024-12-05 12:50:56.971605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:53.750 [2024-12-05 12:50:56.971634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.750 [2024-12-05 12:50:56.971669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:53.750 [2024-12-05 12:50:56.971685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.750 [2024-12-05 12:50:56.971741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:53.750 [2024-12-05 12:50:56.971758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.750 #12 NEW cov: 12409 ft: 13558 corp: 5/155b lim: 50 exec/s: 0 rss: 73Mb L: 38/42 MS: 1 InsertByte- 00:08:53.750 [2024-12-05 12:50:57.031760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:53.750 [2024-12-05 12:50:57.031788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.750 [2024-12-05 12:50:57.031824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:53.750 [2024-12-05 12:50:57.031847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.750 [2024-12-05 12:50:57.031902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:53.750 [2024-12-05 12:50:57.031921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.750 #13 NEW cov: 12409 ft: 13739 corp: 6/192b lim: 50 exec/s: 0 rss: 73Mb L: 37/42 MS: 1 ShuffleBytes- 00:08:54.047 [2024-12-05 12:50:57.071872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.047 [2024-12-05 12:50:57.071900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.071935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.047 [2024-12-05 12:50:57.071951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.072006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.047 [2024-12-05 12:50:57.072022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.047 #14 NEW cov: 12409 ft: 13902 corp: 7/230b lim: 50 exec/s: 0 rss: 73Mb L: 38/42 MS: 1 InsertByte- 00:08:54.047 [2024-12-05 12:50:57.132038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.047 [2024-12-05 12:50:57.132066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.132109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.047 [2024-12-05 12:50:57.132123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.132177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.047 [2024-12-05 12:50:57.132192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.047 #15 NEW cov: 12409 ft: 13944 corp: 8/267b lim: 50 exec/s: 0 rss: 73Mb L: 37/42 MS: 1 ChangeByte- 00:08:54.047 [2024-12-05 12:50:57.172263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.047 [2024-12-05 12:50:57.172290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.172337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.047 [2024-12-05 12:50:57.172352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.172408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.047 [2024-12-05 12:50:57.172423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.172476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.047 [2024-12-05 12:50:57.172490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.047 #16 NEW cov: 12409 ft: 13992 corp: 9/309b lim: 50 exec/s: 0 rss: 73Mb L: 42/42 MS: 1 CrossOver- 00:08:54.047 [2024-12-05 12:50:57.232468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.047 [2024-12-05 12:50:57.232495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.232540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.047 [2024-12-05 12:50:57.232556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.232613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.047 [2024-12-05 12:50:57.232629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.232684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.047 [2024-12-05 12:50:57.232700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.047 #20 NEW cov: 12409 ft: 14064 corp: 10/350b lim: 50 exec/s: 0 rss: 73Mb L: 41/42 MS: 4 CopyPart-ChangeBinInt-CMP-InsertRepeatedBytes- DE: "\000\000\000\000"- 00:08:54.047 [2024-12-05 12:50:57.272574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.047 [2024-12-05 12:50:57.272602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.272643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.047 [2024-12-05 12:50:57.272659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.272715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.047 [2024-12-05 12:50:57.272732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.047 [2024-12-05 12:50:57.272787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.047 [2024-12-05 12:50:57.272803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.047 #21 NEW cov: 12409 ft: 14137 corp: 11/392b lim: 50 exec/s: 0 rss: 73Mb L: 42/42 MS: 1 ChangeBinInt- 00:08:54.047 [2024-12-05 12:50:57.332770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.048 [2024-12-05 12:50:57.332798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.048 [2024-12-05 12:50:57.332839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.048 [2024-12-05 12:50:57.332856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.048 [2024-12-05 12:50:57.332913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.048 [2024-12-05 12:50:57.332930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.048 [2024-12-05 12:50:57.332984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.048 [2024-12-05 12:50:57.333000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.324 #22 NEW cov: 12409 ft: 14202 corp: 12/433b lim: 50 exec/s: 0 rss: 73Mb L: 41/42 MS: 1 InsertRepeatedBytes- 00:08:54.324 [2024-12-05 12:50:57.392904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.324 [2024-12-05 12:50:57.392931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.392976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.324 [2024-12-05 12:50:57.392991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.393044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.324 [2024-12-05 12:50:57.393063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.393118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.324 [2024-12-05 12:50:57.393132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.324 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:54.324 #23 NEW cov: 12432 ft: 14325 corp: 13/478b lim: 50 exec/s: 0 rss: 73Mb L: 45/45 MS: 1 CMP- DE: "\001\230\202\363\177\030u$"- 00:08:54.324 [2024-12-05 12:50:57.433001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.324 [2024-12-05 12:50:57.433028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.433072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.324 [2024-12-05 12:50:57.433087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.433141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.324 [2024-12-05 12:50:57.433157] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.433212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.324 [2024-12-05 12:50:57.433228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.324 #24 NEW cov: 12432 ft: 14349 corp: 14/519b lim: 50 exec/s: 0 rss: 73Mb L: 41/45 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:54.324 [2024-12-05 12:50:57.493197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.324 [2024-12-05 12:50:57.493223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.493270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.324 [2024-12-05 12:50:57.493286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.493342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.324 [2024-12-05 12:50:57.493359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.493413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.324 [2024-12-05 12:50:57.493428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.324 #25 NEW cov: 12432 ft: 14415 corp: 15/566b lim: 50 exec/s: 25 rss: 73Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:54.324 [2024-12-05 12:50:57.533283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.324 [2024-12-05 12:50:57.533311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.533346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.324 [2024-12-05 12:50:57.533362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.533418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.324 [2024-12-05 12:50:57.533434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.533494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.324 [2024-12-05 12:50:57.533509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.324 #26 NEW cov: 12432 ft: 14416 corp: 16/607b lim: 50 exec/s: 26 rss: 73Mb L: 41/47 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:54.324 [2024-12-05 12:50:57.593442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 
nsid:0 00:08:54.324 [2024-12-05 12:50:57.593470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.593515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.324 [2024-12-05 12:50:57.593531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.593586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.324 [2024-12-05 12:50:57.593602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.324 [2024-12-05 12:50:57.593658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.324 [2024-12-05 12:50:57.593674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.598 #27 NEW cov: 12432 ft: 14436 corp: 17/654b lim: 50 exec/s: 27 rss: 73Mb L: 47/47 MS: 1 CopyPart- 00:08:54.598 [2024-12-05 12:50:57.653627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.598 [2024-12-05 12:50:57.653655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.598 [2024-12-05 12:50:57.653696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.598 [2024-12-05 12:50:57.653711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.598 [2024-12-05 12:50:57.653768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.598 [2024-12-05 12:50:57.653784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.598 [2024-12-05 12:50:57.653848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.598 [2024-12-05 12:50:57.653864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.598 #28 NEW cov: 12432 ft: 14456 corp: 18/701b lim: 50 exec/s: 28 rss: 74Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:54.598 [2024-12-05 12:50:57.693542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.598 [2024-12-05 12:50:57.693570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.598 [2024-12-05 12:50:57.693613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.598 [2024-12-05 12:50:57.693627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.598 [2024-12-05 12:50:57.693681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.598 [2024-12-05 12:50:57.693697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.598 #29 NEW cov: 12432 ft: 14471 corp: 19/737b lim: 50 exec/s: 29 rss: 74Mb L: 36/47 MS: 1 EraseBytes- 00:08:54.598 [2024-12-05 12:50:57.733784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.598 [2024-12-05 12:50:57.733815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.598 [2024-12-05 12:50:57.733858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.598 [2024-12-05 12:50:57.733873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.598 [2024-12-05 12:50:57.733926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.598 [2024-12-05 12:50:57.733941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.598 [2024-12-05 12:50:57.733996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.598 [2024-12-05 12:50:57.734011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.598 #30 NEW cov: 12432 ft: 14492 corp: 20/784b lim: 50 exec/s: 30 rss: 74Mb L: 47/47 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:54.598 [2024-12-05 12:50:57.793876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.598 [2024-12-05 12:50:57.793902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.598 [2024-12-05 12:50:57.793947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.598 [2024-12-05 12:50:57.793963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.598 [2024-12-05 12:50:57.794020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.598 [2024-12-05 12:50:57.794036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.598 #31 NEW cov: 12432 ft: 14499 corp: 21/822b lim: 50 exec/s: 31 rss: 74Mb L: 38/47 MS: 1 InsertByte- 00:08:54.598 [2024-12-05 12:50:57.834136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.599 [2024-12-05 12:50:57.834163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.599 [2024-12-05 12:50:57.834208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.599 [2024-12-05 12:50:57.834223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.599 [2024-12-05 12:50:57.834279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.599 [2024-12-05 12:50:57.834295] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.599 [2024-12-05 12:50:57.834351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.599 [2024-12-05 12:50:57.834366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.599 #32 NEW cov: 12432 ft: 14505 corp: 22/863b lim: 50 exec/s: 32 rss: 74Mb L: 41/47 MS: 1 ShuffleBytes- 00:08:54.599 [2024-12-05 12:50:57.894157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.599 [2024-12-05 12:50:57.894185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.599 [2024-12-05 12:50:57.894233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.599 [2024-12-05 12:50:57.894249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.599 [2024-12-05 12:50:57.894307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.599 [2024-12-05 12:50:57.894323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.878 #33 NEW cov: 12432 ft: 14520 corp: 23/901b lim: 50 exec/s: 33 rss: 74Mb L: 38/47 MS: 1 CopyPart- 00:08:54.878 [2024-12-05 12:50:57.954534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.878 [2024-12-05 12:50:57.954564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.878 [2024-12-05 12:50:57.954601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.878 [2024-12-05 12:50:57.954616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.878 [2024-12-05 12:50:57.954672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.878 [2024-12-05 12:50:57.954687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.878 [2024-12-05 12:50:57.954745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.878 [2024-12-05 12:50:57.954762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.878 #34 NEW cov: 12432 ft: 14562 corp: 24/943b lim: 50 exec/s: 34 rss: 74Mb L: 42/47 MS: 1 ShuffleBytes- 00:08:54.878 [2024-12-05 12:50:58.014399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.878 [2024-12-05 12:50:58.014429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.878 [2024-12-05 12:50:58.014475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.878 [2024-12-05 12:50:58.014491] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.878 #35 NEW cov: 12432 ft: 14859 corp: 25/967b lim: 50 exec/s: 35 rss: 74Mb L: 24/47 MS: 1 CrossOver- 00:08:54.878 [2024-12-05 12:50:58.074813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.878 [2024-12-05 12:50:58.074846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.878 [2024-12-05 12:50:58.074889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.878 [2024-12-05 12:50:58.074905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.879 [2024-12-05 12:50:58.074957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.879 [2024-12-05 12:50:58.074973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.879 [2024-12-05 12:50:58.075026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.879 [2024-12-05 12:50:58.075041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.879 #36 NEW cov: 12432 ft: 14903 corp: 26/1009b lim: 50 exec/s: 36 rss: 74Mb L: 42/47 MS: 1 ShuffleBytes- 00:08:54.879 [2024-12-05 12:50:58.114740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.879 [2024-12-05 12:50:58.114768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.879 [2024-12-05 12:50:58.114803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.879 [2024-12-05 12:50:58.114822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.879 [2024-12-05 12:50:58.114878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.879 [2024-12-05 12:50:58.114894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.879 #37 NEW cov: 12432 ft: 14934 corp: 27/1047b lim: 50 exec/s: 37 rss: 74Mb L: 38/47 MS: 1 ShuffleBytes- 00:08:54.879 [2024-12-05 12:50:58.175088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:54.879 [2024-12-05 12:50:58.175116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.879 [2024-12-05 12:50:58.175152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:54.879 [2024-12-05 12:50:58.175168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.879 [2024-12-05 12:50:58.175223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:54.879 [2024-12-05 12:50:58.175239] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.879 [2024-12-05 12:50:58.175294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:54.879 [2024-12-05 12:50:58.175310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.171 #38 NEW cov: 12432 ft: 14935 corp: 28/1089b lim: 50 exec/s: 38 rss: 74Mb L: 42/47 MS: 1 ChangeByte- 00:08:55.171 [2024-12-05 12:50:58.215321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.171 [2024-12-05 12:50:58.215349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.215394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.171 [2024-12-05 12:50:58.215410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.215466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:55.171 [2024-12-05 12:50:58.215482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.215536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:55.171 [2024-12-05 12:50:58.215550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.215606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:55.171 [2024-12-05 12:50:58.215621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:55.171 #39 NEW cov: 12432 ft: 15030 corp: 29/1139b lim: 50 exec/s: 39 rss: 74Mb L: 50/50 MS: 1 CopyPart- 00:08:55.171 [2024-12-05 12:50:58.275362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.171 [2024-12-05 12:50:58.275389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.275435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.171 [2024-12-05 12:50:58.275450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.275507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:55.171 [2024-12-05 12:50:58.275526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.275580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:55.171 [2024-12-05 12:50:58.275596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:55.171 #40 NEW cov: 12432 ft: 15041 corp: 30/1181b lim: 50 exec/s: 40 rss: 74Mb L: 42/50 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:55.171 [2024-12-05 12:50:58.315622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.171 [2024-12-05 12:50:58.315650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.315697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.171 [2024-12-05 12:50:58.315711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.315766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:55.171 [2024-12-05 12:50:58.315782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.315843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:55.171 [2024-12-05 12:50:58.315859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.315918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:55.171 [2024-12-05 12:50:58.315934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:55.171 #41 NEW cov: 12432 ft: 15046 corp: 31/1231b lim: 50 exec/s: 41 rss: 74Mb L: 50/50 MS: 1 PersAutoDict- DE: "\001\230\202\363\177\030u$"- 00:08:55.171 [2024-12-05 12:50:58.355557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.171 [2024-12-05 12:50:58.355584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.355630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.171 [2024-12-05 12:50:58.355645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.355703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:55.171 [2024-12-05 12:50:58.355719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.355775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:55.171 [2024-12-05 12:50:58.355791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.171 #42 NEW cov: 12432 ft: 15051 corp: 32/1273b lim: 50 exec/s: 42 rss: 74Mb L: 42/50 MS: 1 ChangeBinInt- 00:08:55.171 [2024-12-05 12:50:58.395677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.171 [2024-12-05 12:50:58.395705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.395743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.171 [2024-12-05 12:50:58.395760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.395817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:55.171 [2024-12-05 12:50:58.395838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.395893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:55.171 [2024-12-05 12:50:58.395906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.171 #43 NEW cov: 12432 ft: 15071 corp: 33/1318b lim: 50 exec/s: 43 rss: 74Mb L: 45/50 MS: 1 ShuffleBytes- 00:08:55.171 [2024-12-05 12:50:58.455896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.171 [2024-12-05 12:50:58.455923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.455966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.171 [2024-12-05 12:50:58.455981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.456038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:55.171 [2024-12-05 12:50:58.456054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.171 [2024-12-05 12:50:58.456111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:55.171 [2024-12-05 12:50:58.456128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:55.464 #44 NEW cov: 12432 ft: 15092 corp: 34/1360b lim: 50 exec/s: 44 rss: 74Mb L: 42/50 MS: 1 ShuffleBytes- 00:08:55.464 [2024-12-05 12:50:58.495809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:55.464 [2024-12-05 12:50:58.495843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.464 [2024-12-05 12:50:58.495890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:55.464 [2024-12-05 12:50:58.495906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.464 [2024-12-05 12:50:58.495961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:55.464 [2024-12-05 12:50:58.495978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:55.464 #45 NEW cov: 12432 ft: 15128 corp: 35/1398b 
lim: 50 exec/s: 22 rss: 74Mb L: 38/50 MS: 1 ChangeBit-
00:08:55.464 #45 DONE cov: 12432 ft: 15128 corp: 35/1398b lim: 50 exec/s: 22 rss: 74Mb
00:08:55.464 ###### Recommended dictionary. ######
00:08:55.464 "\000\000\000\000" # Uses: 2
00:08:55.464 "\001\230\202\363\177\030u$" # Uses: 1
00:08:55.464 "\377\377\377\377\377\377\377\377" # Uses: 1
00:08:55.464 ###### End of recommended dictionary. ######
00:08:55.464 Done 45 runs in 2 second(s)
00:08:55.464 12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422'
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
12:50:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22
[2024-12-05 12:50:58.663583] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization...
00:08:55.464 [2024-12-05 12:50:58.663651] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid158466 ] 00:08:55.725 [2024-12-05 12:50:58.860805] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.725 [2024-12-05 12:50:58.873544] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.725 [2024-12-05 12:50:58.925889] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:55.725 [2024-12-05 12:50:58.942215] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:55.725 INFO: Running with entropic power schedule (0xFF, 100). 00:08:55.725 INFO: Seed: 3169887906 00:08:55.725 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:08:55.725 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:08:55.725 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:55.725 INFO: A corpus is not provided, starting from an empty corpus 00:08:55.725 #2 INITED exec/s: 0 rss: 64Mb 00:08:55.725 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:55.725 This may also happen if the target rejected all inputs we tried so far 00:08:55.725 [2024-12-05 12:50:59.001267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:55.725 [2024-12-05 12:50:59.001297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.245 NEW_FUNC[1/718]: 0x479fd8 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:56.245 NEW_FUNC[2/718]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:56.245 #4 NEW cov: 12232 ft: 12226 corp: 2/26b lim: 85 exec/s: 0 rss: 72Mb L: 25/25 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:56.245 [2024-12-05 12:50:59.362819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.245 [2024-12-05 12:50:59.362876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.245 #15 NEW cov: 12345 ft: 12977 corp: 3/51b lim: 85 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 CrossOver- 00:08:56.245 [2024-12-05 12:50:59.433008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.245 [2024-12-05 12:50:59.433035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.245 #21 NEW cov: 12351 ft: 13167 corp: 4/77b lim: 85 exec/s: 0 rss: 72Mb L: 26/26 MS: 1 InsertByte- 00:08:56.245 [2024-12-05 12:50:59.473213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.245 [2024-12-05 12:50:59.473243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.245 [2024-12-05 12:50:59.473347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER 
(0d) sqid:1 cid:1 nsid:0 00:08:56.245 [2024-12-05 12:50:59.473371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.245 #22 NEW cov: 12436 ft: 14231 corp: 5/114b lim: 85 exec/s: 0 rss: 72Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:56.245 [2024-12-05 12:50:59.513279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.245 [2024-12-05 12:50:59.513307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.245 #23 NEW cov: 12436 ft: 14374 corp: 6/139b lim: 85 exec/s: 0 rss: 72Mb L: 25/37 MS: 1 ShuffleBytes- 00:08:56.245 [2024-12-05 12:50:59.553287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.246 [2024-12-05 12:50:59.553313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.506 #24 NEW cov: 12436 ft: 14447 corp: 7/165b lim: 85 exec/s: 0 rss: 72Mb L: 26/37 MS: 1 ChangeByte- 00:08:56.506 [2024-12-05 12:50:59.613559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.506 [2024-12-05 12:50:59.613584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.506 #25 NEW cov: 12436 ft: 14507 corp: 8/190b lim: 85 exec/s: 0 rss: 72Mb L: 25/37 MS: 1 ChangeByte- 00:08:56.506 [2024-12-05 12:50:59.674139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.506 [2024-12-05 12:50:59.674170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.506 [2024-12-05 12:50:59.674271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:56.506 [2024-12-05 12:50:59.674291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.506 [2024-12-05 12:50:59.674400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:56.506 [2024-12-05 12:50:59.674423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:56.506 #26 NEW cov: 12436 ft: 14887 corp: 9/250b lim: 85 exec/s: 0 rss: 72Mb L: 60/60 MS: 1 InsertRepeatedBytes- 00:08:56.506 [2024-12-05 12:50:59.733865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.506 [2024-12-05 12:50:59.733894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.506 #27 NEW cov: 12436 ft: 14961 corp: 10/282b lim: 85 exec/s: 0 rss: 72Mb L: 32/60 MS: 1 InsertRepeatedBytes- 00:08:56.506 [2024-12-05 12:50:59.784206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.506 [2024-12-05 12:50:59.784236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.506 [2024-12-05 12:50:59.784338] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:56.506 [2024-12-05 12:50:59.784362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:56.506 #28 NEW cov: 12436 ft: 15063 corp: 11/321b lim: 85 exec/s: 0 rss: 72Mb L: 39/60 MS: 1 InsertRepeatedBytes- 00:08:56.766 [2024-12-05 12:50:59.834155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.766 [2024-12-05 12:50:59.834187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.766 #29 NEW cov: 12436 ft: 15089 corp: 12/346b lim: 85 exec/s: 0 rss: 72Mb L: 25/60 MS: 1 ShuffleBytes- 00:08:56.766 [2024-12-05 12:50:59.874248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.766 [2024-12-05 12:50:59.874276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.766 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:56.766 #30 NEW cov: 12459 ft: 15128 corp: 13/372b lim: 85 exec/s: 0 rss: 72Mb L: 26/60 MS: 1 CopyPart- 00:08:56.766 [2024-12-05 12:50:59.914315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.766 [2024-12-05 12:50:59.914344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.766 #31 NEW cov: 12459 ft: 15167 corp: 14/404b lim: 85 exec/s: 0 rss: 73Mb L: 32/60 MS: 1 CrossOver- 00:08:56.766 [2024-12-05 12:50:59.974524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.766 [2024-12-05 12:50:59.974553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.766 #32 NEW cov: 12459 ft: 15180 corp: 15/429b lim: 85 exec/s: 32 rss: 73Mb L: 25/60 MS: 1 ChangeByte- 00:08:56.766 [2024-12-05 12:51:00.044994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:56.766 [2024-12-05 12:51:00.045028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:56.767 [2024-12-05 12:51:00.045144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:56.767 [2024-12-05 12:51:00.045169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.027 #33 NEW cov: 12459 ft: 15210 corp: 16/466b lim: 85 exec/s: 33 rss: 73Mb L: 37/60 MS: 1 ChangeBit- 00:08:57.027 [2024-12-05 12:51:00.115432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.027 [2024-12-05 12:51:00.115464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.027 [2024-12-05 12:51:00.115588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:57.027 [2024-12-05 12:51:00.115609] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.027 [2024-12-05 12:51:00.115729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:57.027 [2024-12-05 12:51:00.115748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.027 #34 NEW cov: 12459 ft: 15240 corp: 17/526b lim: 85 exec/s: 34 rss: 73Mb L: 60/60 MS: 1 ChangeByte- 00:08:57.027 [2024-12-05 12:51:00.175313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.027 [2024-12-05 12:51:00.175343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.027 [2024-12-05 12:51:00.175469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:57.027 [2024-12-05 12:51:00.175494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.027 #35 NEW cov: 12459 ft: 15290 corp: 18/572b lim: 85 exec/s: 35 rss: 73Mb L: 46/60 MS: 1 CrossOver- 00:08:57.027 [2024-12-05 12:51:00.245558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.027 [2024-12-05 12:51:00.245590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.027 [2024-12-05 12:51:00.245719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:57.027 [2024-12-05 12:51:00.245741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.027 #36 NEW cov: 12459 ft: 15327 corp: 19/619b lim: 85 exec/s: 36 rss: 73Mb L: 47/60 MS: 1 InsertByte- 00:08:57.027 [2024-12-05 12:51:00.315967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.027 [2024-12-05 12:51:00.315999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.027 [2024-12-05 12:51:00.316107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:57.027 [2024-12-05 12:51:00.316128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.027 [2024-12-05 12:51:00.316251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:57.027 [2024-12-05 12:51:00.316273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.287 #37 NEW cov: 12459 ft: 15362 corp: 20/684b lim: 85 exec/s: 37 rss: 73Mb L: 65/65 MS: 1 InsertRepeatedBytes- 00:08:57.287 [2024-12-05 12:51:00.366380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.287 [2024-12-05 12:51:00.366413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.287 [2024-12-05 12:51:00.366488] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:57.287 [2024-12-05 12:51:00.366513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.287 [2024-12-05 12:51:00.366634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:57.287 [2024-12-05 12:51:00.366656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.287 [2024-12-05 12:51:00.366778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:57.287 [2024-12-05 12:51:00.366798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:57.287 #38 NEW cov: 12459 ft: 15739 corp: 21/767b lim: 85 exec/s: 38 rss: 73Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:08:57.287 [2024-12-05 12:51:00.415758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.287 [2024-12-05 12:51:00.415788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.287 #39 NEW cov: 12459 ft: 15811 corp: 22/793b lim: 85 exec/s: 39 rss: 73Mb L: 26/83 MS: 1 ChangeByte- 00:08:57.287 [2024-12-05 12:51:00.475946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.287 [2024-12-05 12:51:00.475977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.287 #40 NEW cov: 12459 ft: 15815 corp: 23/819b lim: 85 exec/s: 40 rss: 73Mb L: 26/83 MS: 1 InsertByte- 00:08:57.287 [2024-12-05 12:51:00.526111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.287 [2024-12-05 12:51:00.526144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.287 #41 NEW cov: 12459 ft: 15887 corp: 24/845b lim: 85 exec/s: 41 rss: 73Mb L: 26/83 MS: 1 ChangeASCIIInt- 00:08:57.287 [2024-12-05 12:51:00.566493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.287 [2024-12-05 12:51:00.566523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.287 [2024-12-05 12:51:00.566640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:57.287 [2024-12-05 12:51:00.566662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.287 #44 NEW cov: 12459 ft: 15900 corp: 25/885b lim: 85 exec/s: 44 rss: 73Mb L: 40/83 MS: 3 ChangeBit-ChangeBit-CrossOver- 00:08:57.547 [2024-12-05 12:51:00.616592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.547 [2024-12-05 12:51:00.616626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.547 [2024-12-05 12:51:00.616735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER 
(0d) sqid:1 cid:1 nsid:0 00:08:57.548 [2024-12-05 12:51:00.616758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.548 #45 NEW cov: 12459 ft: 15908 corp: 26/924b lim: 85 exec/s: 45 rss: 73Mb L: 39/83 MS: 1 CrossOver- 00:08:57.548 [2024-12-05 12:51:00.686590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.548 [2024-12-05 12:51:00.686623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.548 #46 NEW cov: 12459 ft: 15949 corp: 27/951b lim: 85 exec/s: 46 rss: 73Mb L: 27/83 MS: 1 InsertByte- 00:08:57.548 [2024-12-05 12:51:00.756962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.548 [2024-12-05 12:51:00.756988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.548 #47 NEW cov: 12459 ft: 16048 corp: 28/982b lim: 85 exec/s: 47 rss: 73Mb L: 31/83 MS: 1 InsertRepeatedBytes- 00:08:57.548 [2024-12-05 12:51:00.807332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.548 [2024-12-05 12:51:00.807362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.548 [2024-12-05 12:51:00.807468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:57.548 [2024-12-05 12:51:00.807492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.548 [2024-12-05 12:51:00.807624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:57.548 [2024-12-05 12:51:00.807645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:57.548 #48 NEW cov: 12459 ft: 16055 corp: 29/1049b lim: 85 exec/s: 48 rss: 73Mb L: 67/83 MS: 1 InsertRepeatedBytes- 00:08:57.548 [2024-12-05 12:51:00.847065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.548 [2024-12-05 12:51:00.847094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.809 #49 NEW cov: 12459 ft: 16065 corp: 30/1076b lim: 85 exec/s: 49 rss: 74Mb L: 27/83 MS: 1 ChangeBit- 00:08:57.809 [2024-12-05 12:51:00.917536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:57.809 [2024-12-05 12:51:00.917569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:57.809 [2024-12-05 12:51:00.917695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:57.809 [2024-12-05 12:51:00.917717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:57.809 #50 NEW cov: 12459 ft: 16070 corp: 31/1113b lim: 85 exec/s: 50 rss: 74Mb L: 37/83 MS: 1 CopyPart- 00:08:57.809 [2024-12-05 12:51:00.967349] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:57.809 [2024-12-05 12:51:00.967377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:57.809 #51 NEW cov: 12459 ft: 16080 corp: 32/1138b lim: 85 exec/s: 25 rss: 74Mb L: 25/83 MS: 1 ChangeByte-
00:08:57.809 #51 DONE cov: 12459 ft: 16080 corp: 32/1138b lim: 85 exec/s: 25 rss: 74Mb
00:08:57.809 Done 51 runs in 2 second(s)
00:08:57.809 12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423'
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
12:51:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23
[2024-12-05 12:51:01.133265] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization...
00:08:58.070 [2024-12-05 12:51:01.133358] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid158852 ]
00:08:58.070 [2024-12-05 12:51:01.332819] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:58.070 [2024-12-05 12:51:01.345406] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:08:58.329 [2024-12-05 12:51:01.397918] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:58.329 [2024-12-05 12:51:01.414241] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 ***
00:08:58.329 INFO: Running with entropic power schedule (0xFF, 100).
00:08:58.329 INFO: Seed: 1348931996
00:08:58.329 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8),
00:08:58.329 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868),
00:08:58.329 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:58.329 INFO: A corpus is not provided, starting from an empty corpus
00:08:58.329 #2 INITED exec/s: 0 rss: 64Mb
00:08:58.329 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:58.329 This may also happen if the target rejected all inputs we tried so far
00:08:58.329 [2024-12-05 12:51:01.480344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:58.329 [2024-12-05 12:51:01.480386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:58.329 [2024-12-05 12:51:01.480509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:58.329 [2024-12-05 12:51:01.480528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:58.589 NEW_FUNC[1/717]: 0x47d218 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671
00:08:58.589 NEW_FUNC[2/717]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:58.589 #13 NEW cov: 12165 ft: 12154 corp: 2/13b lim: 25 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 InsertRepeatedBytes-
00:08:58.589 [2024-12-05 12:51:01.831411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:58.589 [2024-12-05 12:51:01.831456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:58.589 [2024-12-05 12:51:01.831593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:58.589 [2024-12-05 12:51:01.831619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:58.589 [2024-12-05 12:51:01.831753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:58.589 [2024-12-05 12:51:01.831778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:58.598 #18 NEW cov: 12278 ft:
13119 corp: 3/32b lim: 25 exec/s: 0 rss: 72Mb L: 19/19 MS: 5 CrossOver-InsertByte-ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:08:58.589 [2024-12-05 12:51:01.891475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:58.589 [2024-12-05 12:51:01.891509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:58.589 [2024-12-05 12:51:01.891606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:58.589 [2024-12-05 12:51:01.891626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:58.589 [2024-12-05 12:51:01.891749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:58.589 [2024-12-05 12:51:01.891773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:58.848 #19 NEW cov: 12284 ft: 13208 corp: 4/51b lim: 25 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 ShuffleBytes- 00:08:58.848 [2024-12-05 12:51:01.961572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:58.848 [2024-12-05 12:51:01.961598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:58.848 [2024-12-05 12:51:01.961714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:58.848 [2024-12-05 12:51:01.961756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:58.848 #20 NEW cov: 12369 ft: 13542 corp: 5/61b lim: 25 exec/s: 0 rss: 72Mb L: 10/19 MS: 1 EraseBytes- 00:08:58.848 [2024-12-05 12:51:02.011522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:58.848 [2024-12-05 12:51:02.011548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:58.848 #21 NEW cov: 12369 ft: 13951 corp: 6/69b lim: 25 exec/s: 0 rss: 72Mb L: 8/19 MS: 1 EraseBytes- 00:08:58.848 [2024-12-05 12:51:02.081974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:58.848 [2024-12-05 12:51:02.082004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:58.848 [2024-12-05 12:51:02.082143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:58.848 [2024-12-05 12:51:02.082165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:58.848 #22 NEW cov: 12369 ft: 14028 corp: 7/81b lim: 25 exec/s: 0 rss: 72Mb L: 12/19 MS: 1 CrossOver- 00:08:58.848 [2024-12-05 12:51:02.132158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:58.848 [2024-12-05 12:51:02.132194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:58.848 [2024-12-05 12:51:02.132314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:58.848 [2024-12-05 12:51:02.132335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:58.848 [2024-12-05 12:51:02.132463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:58.848 [2024-12-05 12:51:02.132487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:59.107 #23 NEW cov: 12369 ft: 14109 corp: 8/100b lim: 25 exec/s: 0 rss: 73Mb L: 19/19 MS: 1 CopyPart- 00:08:59.107 [2024-12-05 12:51:02.202541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.107 [2024-12-05 12:51:02.202573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.107 [2024-12-05 12:51:02.202698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.107 [2024-12-05 12:51:02.202719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.107 [2024-12-05 12:51:02.202844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:59.107 [2024-12-05 12:51:02.202865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:59.107 [2024-12-05 12:51:02.202989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:59.107 [2024-12-05 12:51:02.203012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:59.107 #24 NEW cov: 12369 ft: 14588 corp: 9/120b lim: 25 exec/s: 0 rss: 73Mb L: 20/20 MS: 1 InsertByte- 00:08:59.107 [2024-12-05 12:51:02.252294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.107 [2024-12-05 12:51:02.252328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.107 [2024-12-05 12:51:02.252450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.107 [2024-12-05 12:51:02.252474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.107 #25 NEW cov: 12369 ft: 14682 corp: 10/133b lim: 25 exec/s: 0 rss: 73Mb L: 13/20 MS: 1 CrossOver- 00:08:59.107 [2024-12-05 12:51:02.322482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.107 [2024-12-05 12:51:02.322515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.107 [2024-12-05 12:51:02.322664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.107 [2024-12-05 12:51:02.322685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.107 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:59.107 #26 NEW cov: 12392 ft: 14744 corp: 11/143b lim: 25 exec/s: 0 rss: 73Mb L: 10/20 MS: 1 EraseBytes- 00:08:59.107 [2024-12-05 12:51:02.392794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.107 [2024-12-05 12:51:02.392820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.107 [2024-12-05 12:51:02.392958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.107 [2024-12-05 12:51:02.392981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.366 #27 NEW cov: 12392 ft: 14789 corp: 12/154b lim: 25 exec/s: 0 rss: 73Mb L: 11/20 MS: 1 InsertByte- 00:08:59.366 [2024-12-05 12:51:02.463185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.366 [2024-12-05 12:51:02.463216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.366 [2024-12-05 12:51:02.463335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.366 [2024-12-05 12:51:02.463356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.366 [2024-12-05 12:51:02.463484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:59.366 [2024-12-05 12:51:02.463507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:59.366 #28 NEW cov: 12392 ft: 14795 corp: 13/173b lim: 25 exec/s: 28 rss: 73Mb L: 19/20 MS: 1 ChangeBit- 00:08:59.366 [2024-12-05 12:51:02.513154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.366 [2024-12-05 12:51:02.513184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.366 [2024-12-05 12:51:02.513307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.366 [2024-12-05 12:51:02.513330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.366 #29 NEW cov: 12392 ft: 14815 corp: 14/186b lim: 25 exec/s: 29 rss: 73Mb L: 13/20 MS: 1 EraseBytes- 00:08:59.366 [2024-12-05 12:51:02.563323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.366 [2024-12-05 12:51:02.563356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.366 [2024-12-05 12:51:02.563497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.366 [2024-12-05 12:51:02.563519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.366 #30 NEW cov: 12392 ft: 14865 corp: 15/196b lim: 25 exec/s: 30 rss: 73Mb L: 10/20 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\004"- 
00:08:59.366 [2024-12-05 12:51:02.613392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.366 [2024-12-05 12:51:02.613420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.366 #31 NEW cov: 12392 ft: 14866 corp: 16/204b lim: 25 exec/s: 31 rss: 73Mb L: 8/20 MS: 1 ChangeByte- 00:08:59.625 [2024-12-05 12:51:02.683750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.625 [2024-12-05 12:51:02.683777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.625 [2024-12-05 12:51:02.683911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.625 [2024-12-05 12:51:02.683930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.625 #32 NEW cov: 12392 ft: 14900 corp: 17/218b lim: 25 exec/s: 32 rss: 73Mb L: 14/20 MS: 1 InsertByte- 00:08:59.625 [2024-12-05 12:51:02.753764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.625 [2024-12-05 12:51:02.753790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.625 #33 NEW cov: 12392 ft: 14936 corp: 18/224b lim: 25 exec/s: 33 rss: 73Mb L: 6/20 MS: 1 EraseBytes- 00:08:59.625 [2024-12-05 12:51:02.804076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.625 [2024-12-05 12:51:02.804110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.625 [2024-12-05 12:51:02.804236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.625 [2024-12-05 12:51:02.804262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.625 #34 NEW cov: 12392 ft: 14962 corp: 19/234b lim: 25 exec/s: 34 rss: 73Mb L: 10/20 MS: 1 ChangeBit- 00:08:59.625 [2024-12-05 12:51:02.874457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.625 [2024-12-05 12:51:02.874492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.625 [2024-12-05 12:51:02.874612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.625 [2024-12-05 12:51:02.874635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.625 [2024-12-05 12:51:02.874766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:59.625 [2024-12-05 12:51:02.874790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:59.625 #35 NEW cov: 12392 ft: 14973 corp: 20/253b lim: 25 exec/s: 35 rss: 73Mb L: 19/20 MS: 1 ChangeBinInt- 00:08:59.884 [2024-12-05 12:51:02.944355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.884 [2024-12-05 12:51:02.944381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.884 #36 NEW cov: 12392 ft: 15005 corp: 21/258b lim: 25 exec/s: 36 rss: 73Mb L: 5/20 MS: 1 EraseBytes- 00:08:59.884 [2024-12-05 12:51:02.994423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.884 [2024-12-05 12:51:02.994455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.884 #37 NEW cov: 12392 ft: 15012 corp: 22/267b lim: 25 exec/s: 37 rss: 74Mb L: 9/20 MS: 1 InsertRepeatedBytes- 00:08:59.884 [2024-12-05 12:51:03.064781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.884 [2024-12-05 12:51:03.064816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.884 [2024-12-05 12:51:03.064938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.884 [2024-12-05 12:51:03.064959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.884 #38 NEW cov: 12392 ft: 15022 corp: 23/281b lim: 25 exec/s: 38 rss: 74Mb L: 14/20 MS: 1 ChangeBinInt- 00:08:59.884 [2024-12-05 12:51:03.135451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.884 [2024-12-05 12:51:03.135484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.884 [2024-12-05 12:51:03.135582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.884 [2024-12-05 12:51:03.135603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:59.884 [2024-12-05 12:51:03.135734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:59.884 [2024-12-05 12:51:03.135756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:59.884 [2024-12-05 12:51:03.135900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:59.884 [2024-12-05 12:51:03.135924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:59.884 #39 NEW cov: 12392 ft: 15056 corp: 24/301b lim: 25 exec/s: 39 rss: 74Mb L: 20/20 MS: 1 ChangeBinInt- 00:08:59.884 [2024-12-05 12:51:03.185547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:59.884 [2024-12-05 12:51:03.185576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:59.884 [2024-12-05 12:51:03.185667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:59.884 [2024-12-05 12:51:03.185689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:59.884 [2024-12-05 12:51:03.185808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:59.884 [2024-12-05 12:51:03.185827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:59.884 [2024-12-05 12:51:03.185965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:59.884 [2024-12-05 12:51:03.185986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:00.143 #40 NEW cov: 12392 ft: 15064 corp: 25/324b lim: 25 exec/s: 40 rss: 74Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:09:00.143 [2024-12-05 12:51:03.235294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.143 [2024-12-05 12:51:03.235320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.143 [2024-12-05 12:51:03.305381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.143 [2024-12-05 12:51:03.305413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.143 #42 NEW cov: 12392 ft: 15135 corp: 26/333b lim: 25 exec/s: 42 rss: 74Mb L: 9/23 MS: 2 ChangeBit-ShuffleBytes- 00:09:00.143 [2024-12-05 12:51:03.355689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.143 [2024-12-05 12:51:03.355724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.143 [2024-12-05 12:51:03.355848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.143 [2024-12-05 12:51:03.355887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.143 #43 NEW cov: 12392 ft: 15150 corp: 27/346b lim: 25 exec/s: 43 rss: 74Mb L: 13/23 MS: 1 InsertByte- 00:09:00.143 [2024-12-05 12:51:03.405949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.143 [2024-12-05 12:51:03.405982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.143 [2024-12-05 12:51:03.406107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.143 [2024-12-05 12:51:03.406130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.143 #44 NEW cov: 12392 ft: 15166 corp: 28/359b lim: 25 exec/s: 44 rss: 74Mb L: 13/23 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\004"- 00:09:00.143 [2024-12-05 12:51:03.455994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:00.143 [2024-12-05 12:51:03.456028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.143 [2024-12-05 12:51:03.456154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:00.143 
[2024-12-05 12:51:03.456176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.402 #45 NEW cov: 12392 ft: 15168 corp: 29/372b lim: 25 exec/s: 22 rss: 74Mb L: 13/23 MS: 1 ChangeByte- 00:09:00.402 #45 DONE cov: 12392 ft: 15168 corp: 29/372b lim: 25 exec/s: 22 rss: 74Mb 00:09:00.402 ###### Recommended dictionary. ###### 00:09:00.402 "\000\000\000\000\000\000\000\004" # Uses: 1 00:09:00.402 ###### End of recommended dictionary. ###### 00:09:00.402 Done 45 runs in 2 second(s) 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:00.402 12:51:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:09:00.402 [2024-12-05 12:51:03.643109] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
00:09:00.402 [2024-12-05 12:51:03.643181] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid159409 ] 00:09:00.660 [2024-12-05 12:51:03.836581] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:00.660 [2024-12-05 12:51:03.849221] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.660 [2024-12-05 12:51:03.901578] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:00.660 [2024-12-05 12:51:03.917926] tcp.c:1100:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:09:00.660 INFO: Running with entropic power schedule (0xFF, 100). 00:09:00.660 INFO: Seed: 3851909479 00:09:00.660 INFO: Loaded 1 modules (389724 inline 8-bit counters): 389724 [0x2abc04c, 0x2b1b2a8), 00:09:00.660 INFO: Loaded 1 PC tables (389724 PCs): 389724 [0x2b1b2a8,0x310d868), 00:09:00.660 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:09:00.660 INFO: A corpus is not provided, starting from an empty corpus 00:09:00.660 #2 INITED exec/s: 0 rss: 65Mb 00:09:00.660 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:00.660 This may also happen if the target rejected all inputs we tried so far 00:09:00.918 [2024-12-05 12:51:03.984246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.918 [2024-12-05 12:51:03.984281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:00.918 [2024-12-05 12:51:03.984400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.918 [2024-12-05 12:51:03.984426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:00.919 [2024-12-05 12:51:03.984540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.919 [2024-12-05 12:51:03.984565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:01.177 NEW_FUNC[1/718]: 0x47e308 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:09:01.177 NEW_FUNC[2/718]: 0x48ef88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:01.177 #9 NEW cov: 12237 ft: 12237 corp: 2/61b lim: 100 exec/s: 0 rss: 73Mb L: 60/60 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:09:01.177 [2024-12-05 12:51:04.335231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.177 [2024-12-05 12:51:04.335281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.177 [2024-12-05 12:51:04.335402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.177 [2024-12-05 12:51:04.335428] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.177 [2024-12-05 12:51:04.335543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.177 [2024-12-05 12:51:04.335567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:01.177 #10 NEW cov: 12350 ft: 12892 corp: 3/121b lim: 100 exec/s: 0 rss: 73Mb L: 60/60 MS: 1 ChangeByte- 00:09:01.177 [2024-12-05 12:51:04.405172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.177 [2024-12-05 12:51:04.405199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.177 [2024-12-05 12:51:04.405318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.177 [2024-12-05 12:51:04.405341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.177 #11 NEW cov: 12356 ft: 13326 corp: 4/166b lim: 100 exec/s: 0 rss: 73Mb L: 45/60 MS: 1 EraseBytes- 00:09:01.177 [2024-12-05 12:51:04.454989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.177 [2024-12-05 12:51:04.455016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.177 #12 NEW cov: 12441 ft: 14284 corp: 5/200b lim: 100 exec/s: 0 rss: 73Mb L: 34/60 MS: 1 InsertRepeatedBytes- 00:09:01.437 [2024-12-05 12:51:04.505736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.505769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.437 [2024-12-05 12:51:04.505897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:33554432 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.505921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.437 [2024-12-05 12:51:04.506056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.506076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:01.437 #13 NEW cov: 12441 ft: 14328 corp: 6/260b lim: 100 exec/s: 0 rss: 73Mb L: 60/60 MS: 1 ChangeBinInt- 00:09:01.437 [2024-12-05 12:51:04.575652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.575687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.437 [2024-12-05 12:51:04.575808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE 
sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.575827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.437 #14 NEW cov: 12441 ft: 14425 corp: 7/300b lim: 100 exec/s: 0 rss: 73Mb L: 40/60 MS: 1 EraseBytes- 00:09:01.437 [2024-12-05 12:51:04.646166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.646198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.437 [2024-12-05 12:51:04.646319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.646342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.437 [2024-12-05 12:51:04.646464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.646486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:01.437 #15 NEW cov: 12441 ft: 14522 corp: 8/366b lim: 100 exec/s: 0 rss: 73Mb L: 66/66 MS: 1 InsertRepeatedBytes- 00:09:01.437 [2024-12-05 12:51:04.696157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.696191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.437 [2024-12-05 12:51:04.696297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2969567232 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.696320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.437 [2024-12-05 12:51:04.696438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.696461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:01.437 #16 NEW cov: 12441 ft: 14547 corp: 9/426b lim: 100 exec/s: 0 rss: 73Mb L: 60/66 MS: 1 ChangeByte- 00:09:01.437 [2024-12-05 12:51:04.746418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.746449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.437 [2024-12-05 12:51:04.746563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.746583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.437 [2024-12-05 12:51:04.746701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 
len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.437 [2024-12-05 12:51:04.746725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:01.696 #22 NEW cov: 12441 ft: 14656 corp: 10/486b lim: 100 exec/s: 0 rss: 73Mb L: 60/66 MS: 1 ChangeBit- 00:09:01.696 [2024-12-05 12:51:04.786249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:04.786275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.696 [2024-12-05 12:51:04.786402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:04.786422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.696 #23 NEW cov: 12441 ft: 14709 corp: 11/538b lim: 100 exec/s: 0 rss: 73Mb L: 52/66 MS: 1 CopyPart- 00:09:01.696 [2024-12-05 12:51:04.856447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:04.856480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.696 [2024-12-05 12:51:04.856598] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:04.856617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.696 NEW_FUNC[1/1]: 0x1c60bc8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:01.696 #29 NEW cov: 12464 ft: 14760 corp: 12/591b lim: 100 exec/s: 0 rss: 73Mb L: 53/66 MS: 1 CrossOver- 00:09:01.696 [2024-12-05 12:51:04.906840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:04.906874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.696 [2024-12-05 12:51:04.906999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:04.907016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.696 [2024-12-05 12:51:04.907130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:28 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:04.907153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:01.696 #30 NEW cov: 12464 ft: 14804 corp: 13/651b lim: 100 exec/s: 0 rss: 73Mb L: 60/66 MS: 1 ChangeByte- 00:09:01.696 [2024-12-05 12:51:04.946666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:04.946697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.696 [2024-12-05 12:51:04.946815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:04.946840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.696 #31 NEW cov: 12464 ft: 14834 corp: 14/691b lim: 100 exec/s: 31 rss: 74Mb L: 40/66 MS: 1 CrossOver- 00:09:01.696 [2024-12-05 12:51:05.007186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:05.007217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.696 [2024-12-05 12:51:05.007338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:05.007356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.696 [2024-12-05 12:51:05.007485] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.696 [2024-12-05 12:51:05.007507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:01.955 #32 NEW cov: 12464 ft: 14869 corp: 15/761b lim: 100 exec/s: 32 rss: 74Mb L: 70/70 MS: 1 CrossOver- 00:09:01.955 [2024-12-05 12:51:05.067536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:189854918901760 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.067568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.955 [2024-12-05 12:51:05.067664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.067684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.955 [2024-12-05 12:51:05.067807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.067830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:01.955 [2024-12-05 12:51:05.067957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.067981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:01.955 #35 NEW cov: 12464 ft: 15262 corp: 16/848b lim: 100 exec/s: 35 rss: 74Mb L: 87/87 MS: 3 ChangeBit-InsertRepeatedBytes-InsertRepeatedBytes- 00:09:01.955 [2024-12-05 12:51:05.107310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:09:01.955 [2024-12-05 12:51:05.107341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.955 [2024-12-05 12:51:05.107470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.107494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.955 [2024-12-05 12:51:05.107611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.107634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:01.955 #41 NEW cov: 12464 ft: 15268 corp: 17/919b lim: 100 exec/s: 41 rss: 74Mb L: 71/87 MS: 1 CopyPart- 00:09:01.955 [2024-12-05 12:51:05.157254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.157288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.955 [2024-12-05 12:51:05.157406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.157439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.955 #42 NEW cov: 12464 ft: 15314 corp: 18/959b lim: 100 exec/s: 42 rss: 74Mb L: 40/87 MS: 1 ChangeByte- 00:09:01.955 [2024-12-05 12:51:05.197396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.197422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.955 [2024-12-05 12:51:05.197544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:281474976710656 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.197563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:01.955 #43 NEW cov: 12464 ft: 15343 corp: 19/1011b lim: 100 exec/s: 43 rss: 74Mb L: 52/87 MS: 1 ChangeBit- 00:09:01.955 [2024-12-05 12:51:05.267624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.267655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:01.955 [2024-12-05 12:51:05.267777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.955 [2024-12-05 12:51:05.267796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.213 #44 NEW cov: 12464 ft: 15367 corp: 20/1051b lim: 100 exec/s: 44 rss: 74Mb L: 40/87 MS: 1 CMP- DE: "U\364\016th\177\000\000"- 00:09:02.213 [2024-12-05 12:51:05.317836] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.213 [2024-12-05 12:51:05.317867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.213 [2024-12-05 12:51:05.318000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.213 [2024-12-05 12:51:05.318025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.213 #45 NEW cov: 12464 ft: 15395 corp: 21/1091b lim: 100 exec/s: 45 rss: 74Mb L: 40/87 MS: 1 ChangeByte- 00:09:02.213 [2024-12-05 12:51:05.388515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.213 [2024-12-05 12:51:05.388545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.213 [2024-12-05 12:51:05.388628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:10706345580035347604 len:38037 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.213 [2024-12-05 12:51:05.388650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.213 [2024-12-05 12:51:05.388765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:10706345580035347604 len:38037 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.213 [2024-12-05 12:51:05.388788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:02.213 [2024-12-05 12:51:05.388904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:10706345580035347604 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.213 [2024-12-05 12:51:05.388926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:02.213 #46 NEW cov: 12464 ft: 15420 corp: 22/1185b lim: 100 exec/s: 46 rss: 74Mb L: 94/94 MS: 1 InsertRepeatedBytes- 00:09:02.213 [2024-12-05 12:51:05.437849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.213 [2024-12-05 12:51:05.437880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.213 #47 NEW cov: 12464 ft: 15473 corp: 23/1219b lim: 100 exec/s: 47 rss: 74Mb L: 34/94 MS: 1 CopyPart- 00:09:02.213 [2024-12-05 12:51:05.508610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.213 [2024-12-05 12:51:05.508641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.213 [2024-12-05 12:51:05.508728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:33554432 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.213 [2024-12-05 12:51:05.508746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:09:02.213 [2024-12-05 12:51:05.508876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.213 [2024-12-05 12:51:05.508895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:02.471 #48 NEW cov: 12464 ft: 15484 corp: 24/1279b lim: 100 exec/s: 48 rss: 74Mb L: 60/94 MS: 1 ChangeBit- 00:09:02.471 [2024-12-05 12:51:05.578245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.471 [2024-12-05 12:51:05.578277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.471 #49 NEW cov: 12464 ft: 15562 corp: 25/1314b lim: 100 exec/s: 49 rss: 74Mb L: 35/94 MS: 1 InsertByte- 00:09:02.471 [2024-12-05 12:51:05.639000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.471 [2024-12-05 12:51:05.639029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.471 [2024-12-05 12:51:05.639140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.471 [2024-12-05 12:51:05.639162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.471 [2024-12-05 12:51:05.639282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.471 [2024-12-05 12:51:05.639302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:02.471 #50 NEW cov: 12464 ft: 15637 corp: 26/1382b lim: 100 exec/s: 50 rss: 74Mb L: 68/94 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\002"- 00:09:02.471 [2024-12-05 12:51:05.709177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.471 [2024-12-05 12:51:05.709209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.471 [2024-12-05 12:51:05.709300] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.471 [2024-12-05 12:51:05.709320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.471 [2024-12-05 12:51:05.709443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.471 [2024-12-05 12:51:05.709467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:02.471 #51 NEW cov: 12464 ft: 15659 corp: 27/1448b lim: 100 exec/s: 51 rss: 74Mb L: 66/94 MS: 1 CrossOver- 00:09:02.471 [2024-12-05 12:51:05.778928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.471 [2024-12-05 12:51:05.778954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.730 #52 NEW cov: 12464 ft: 15725 corp: 28/1483b lim: 100 exec/s: 52 rss: 74Mb L: 35/94 MS: 1 EraseBytes- 00:09:02.730 [2024-12-05 12:51:05.829502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:288230376151711744 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.731 [2024-12-05 12:51:05.829534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.731 [2024-12-05 12:51:05.829640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2969567232 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.731 [2024-12-05 12:51:05.829661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.731 [2024-12-05 12:51:05.829781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.731 [2024-12-05 12:51:05.829805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:02.731 #53 NEW cov: 12464 ft: 15729 corp: 29/1543b lim: 100 exec/s: 53 rss: 75Mb L: 60/94 MS: 1 ChangeBit- 00:09:02.731 [2024-12-05 12:51:05.899524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.731 [2024-12-05 12:51:05.899558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.731 [2024-12-05 12:51:05.899676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.731 [2024-12-05 12:51:05.899697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.731 #54 NEW cov: 12464 ft: 15744 corp: 30/1584b lim: 100 exec/s: 54 rss: 75Mb L: 41/94 MS: 1 InsertByte- 00:09:02.731 [2024-12-05 12:51:05.949719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.731 [2024-12-05 12:51:05.949747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:02.731 [2024-12-05 12:51:05.949860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:9729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:02.731 [2024-12-05 12:51:05.949884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:02.731 #55 NEW cov: 12464 ft: 15772 corp: 31/1627b lim: 100 exec/s: 27 rss: 75Mb L: 43/94 MS: 1 CrossOver- 00:09:02.731 #55 DONE cov: 12464 ft: 15772 corp: 31/1627b lim: 100 exec/s: 27 rss: 75Mb 00:09:02.731 ###### Recommended dictionary. ###### 00:09:02.731 "U\364\016th\177\000\000" # Uses: 0 00:09:02.731 "\000\000\000\000\000\000\000\002" # Uses: 0 00:09:02.731 ###### End of recommended dictionary. 
###### 00:09:02.731 Done 55 runs in 2 second(s) 00:09:02.991 12:51:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:09:02.991 12:51:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:02.991 12:51:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:02.991 12:51:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:09:02.991 00:09:02.991 real 1m3.063s 00:09:02.991 user 1m39.382s 00:09:02.991 sys 0m7.510s 00:09:02.991 12:51:06 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:02.991 12:51:06 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:02.991 ************************************ 00:09:02.991 END TEST nvmf_llvm_fuzz 00:09:02.991 ************************************ 00:09:02.991 12:51:06 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:09:02.991 12:51:06 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:09:02.991 12:51:06 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:09:02.991 12:51:06 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:09:02.991 12:51:06 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:09:02.991 12:51:06 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:02.991 ************************************ 00:09:02.991 START TEST vfio_llvm_fuzz 00:09:02.991 ************************************ 00:09:02.991 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:09:02.991 * Looking for test storage... 
00:09:02.991 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:02.991 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:09:02.991 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1711 -- # lcov --version 00:09:02.991 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:09:03.254 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.254 --rc genhtml_branch_coverage=1 00:09:03.254 --rc genhtml_function_coverage=1 00:09:03.254 --rc genhtml_legend=1 00:09:03.254 --rc geninfo_all_blocks=1 00:09:03.254 --rc geninfo_unexecuted_blocks=1 00:09:03.254 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.254 ' 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:09:03.254 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.254 --rc genhtml_branch_coverage=1 00:09:03.254 --rc genhtml_function_coverage=1 00:09:03.254 --rc genhtml_legend=1 00:09:03.254 --rc geninfo_all_blocks=1 00:09:03.254 --rc geninfo_unexecuted_blocks=1 00:09:03.254 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.254 ' 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:09:03.254 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.254 --rc genhtml_branch_coverage=1 00:09:03.254 --rc genhtml_function_coverage=1 00:09:03.254 --rc genhtml_legend=1 00:09:03.254 --rc geninfo_all_blocks=1 00:09:03.254 --rc geninfo_unexecuted_blocks=1 00:09:03.254 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.254 ' 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:09:03.254 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.254 --rc genhtml_branch_coverage=1 00:09:03.254 --rc genhtml_function_coverage=1 00:09:03.254 --rc genhtml_legend=1 00:09:03.254 --rc geninfo_all_blocks=1 00:09:03.254 --rc geninfo_unexecuted_blocks=1 00:09:03.254 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.254 ' 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:09:03.254 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:09:03.255 12:51:06 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:03.255 12:51:06 
llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:09:03.255 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:09:03.255 #define SPDK_CONFIG_H 00:09:03.255 #define SPDK_CONFIG_AIO_FSDEV 1 00:09:03.255 #define SPDK_CONFIG_APPS 1 00:09:03.255 #define SPDK_CONFIG_ARCH native 00:09:03.255 #undef SPDK_CONFIG_ASAN 00:09:03.255 #undef SPDK_CONFIG_AVAHI 00:09:03.255 #undef SPDK_CONFIG_CET 00:09:03.255 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:09:03.255 #define SPDK_CONFIG_COVERAGE 1 00:09:03.255 #define SPDK_CONFIG_CROSS_PREFIX 00:09:03.255 #undef SPDK_CONFIG_CRYPTO 00:09:03.255 #undef SPDK_CONFIG_CRYPTO_MLX5 00:09:03.255 #undef SPDK_CONFIG_CUSTOMOCF 00:09:03.255 #undef SPDK_CONFIG_DAOS 00:09:03.255 #define SPDK_CONFIG_DAOS_DIR 00:09:03.255 #define SPDK_CONFIG_DEBUG 1 00:09:03.255 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:09:03.255 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:03.255 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:09:03.255 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:03.255 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:09:03.255 #undef SPDK_CONFIG_DPDK_UADK 00:09:03.255 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:03.255 #define SPDK_CONFIG_EXAMPLES 1 00:09:03.255 #undef SPDK_CONFIG_FC 00:09:03.255 #define SPDK_CONFIG_FC_PATH 00:09:03.255 #define SPDK_CONFIG_FIO_PLUGIN 1 00:09:03.255 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:09:03.255 #define SPDK_CONFIG_FSDEV 1 00:09:03.255 #undef 
SPDK_CONFIG_FUSE 00:09:03.255 #define SPDK_CONFIG_FUZZER 1 00:09:03.255 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:09:03.255 #undef SPDK_CONFIG_GOLANG 00:09:03.255 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:09:03.255 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:09:03.255 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:09:03.255 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:09:03.255 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:09:03.255 #undef SPDK_CONFIG_HAVE_LIBBSD 00:09:03.255 #undef SPDK_CONFIG_HAVE_LZ4 00:09:03.255 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:09:03.255 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:09:03.255 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:09:03.255 #define SPDK_CONFIG_IDXD 1 00:09:03.255 #define SPDK_CONFIG_IDXD_KERNEL 1 00:09:03.255 #undef SPDK_CONFIG_IPSEC_MB 00:09:03.255 #define SPDK_CONFIG_IPSEC_MB_DIR 00:09:03.255 #define SPDK_CONFIG_ISAL 1 00:09:03.255 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:09:03.255 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:09:03.255 #define SPDK_CONFIG_LIBDIR 00:09:03.255 #undef SPDK_CONFIG_LTO 00:09:03.255 #define SPDK_CONFIG_MAX_LCORES 128 00:09:03.255 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:09:03.255 #define SPDK_CONFIG_NVME_CUSE 1 00:09:03.255 #undef SPDK_CONFIG_OCF 00:09:03.255 #define SPDK_CONFIG_OCF_PATH 00:09:03.255 #define SPDK_CONFIG_OPENSSL_PATH 00:09:03.255 #undef SPDK_CONFIG_PGO_CAPTURE 00:09:03.255 #define SPDK_CONFIG_PGO_DIR 00:09:03.255 #undef SPDK_CONFIG_PGO_USE 00:09:03.255 #define SPDK_CONFIG_PREFIX /usr/local 00:09:03.255 #undef SPDK_CONFIG_RAID5F 00:09:03.255 #undef SPDK_CONFIG_RBD 00:09:03.255 #define SPDK_CONFIG_RDMA 1 00:09:03.255 #define SPDK_CONFIG_RDMA_PROV verbs 00:09:03.255 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:09:03.255 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:09:03.255 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:09:03.255 #undef SPDK_CONFIG_SHARED 00:09:03.255 #undef SPDK_CONFIG_SMA 00:09:03.255 #define SPDK_CONFIG_TESTS 1 00:09:03.255 #undef SPDK_CONFIG_TSAN 00:09:03.255 #define SPDK_CONFIG_UBLK 1 00:09:03.255 #define SPDK_CONFIG_UBSAN 1 00:09:03.255 #undef SPDK_CONFIG_UNIT_TESTS 00:09:03.255 #undef SPDK_CONFIG_URING 00:09:03.255 #define SPDK_CONFIG_URING_PATH 00:09:03.255 #undef SPDK_CONFIG_URING_ZNS 00:09:03.255 #undef SPDK_CONFIG_USDT 00:09:03.255 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:09:03.256 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:09:03.256 #define SPDK_CONFIG_VFIO_USER 1 00:09:03.256 #define SPDK_CONFIG_VFIO_USER_DIR 00:09:03.256 #define SPDK_CONFIG_VHOST 1 00:09:03.256 #define SPDK_CONFIG_VIRTIO 1 00:09:03.256 #undef SPDK_CONFIG_VTUNE 00:09:03.256 #define SPDK_CONFIG_VTUNE_DIR 00:09:03.256 #define SPDK_CONFIG_WERROR 1 00:09:03.256 #define SPDK_CONFIG_WPDK_DIR 00:09:03.256 #undef SPDK_CONFIG_XNVME 00:09:03.256 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:09:03.256 12:51:06 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:09:03.256 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:09:03.257 
12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : v22.11.4 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:09:03.257 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 160409 ]] 00:09:03.258 
12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 160409 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1696 -- # set_test_storage 2147483648 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.SpE3WI 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.SpE3WI/tests/vfio /tmp/spdk.SpE3WI 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=52716027904 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730582528 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=9014554624 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30861860864 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865289216 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=3428352 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340121600 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346118144 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5996544 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30864977920 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865293312 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=315392 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:09:03.258 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173044736 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173057024 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:09:03.259 * Looking for test storage... 
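
The trace above shows set_test_storage (autotest_common.sh) sizing up every mounted filesystem before picking a scratch area for this fuzz run: each row of "df -T" output is read into associative arrays keyed by mount point. Below is a minimal standalone sketch of that parsing loop, assuming df's default 1 KiB block units (which the byte values logged above are consistent with); variable names mirror the trace, and the candidate-selection logic that follows it is omitted:

    #!/usr/bin/env bash
    # Parse `df -T` into associative arrays, one entry per mount point.
    declare -A mounts fss avails sizes uses
    # df -T columns: source, fs type, 1K-blocks, used, available, use%, mount.
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        avails["$mount"]=$((avail * 1024))  # scale 1 KiB blocks to bytes
        sizes["$mount"]=$((size * 1024))
        uses["$mount"]=$((use * 1024))
    done < <(df -T | grep -v Filesystem)

Worked through with the numbers logged just below: the test requests 2214592512 bytes, the overlay mount at / offers target_space=52716027904, and the projected usage new_size = uses[/] + requested_size = 9014554624 + 2214592512 = 11229147136, roughly 18% of the 61730582528-byte filesystem. That is well under the 95% occupancy cutoff, so / is accepted and the test storage lands under spdk/test/fuzz/llvm/vfio.
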
00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=52716027904 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=11229147136 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:03.259 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1698 -- # set -o errtrace 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1699 -- # shopt -s extdebug 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1700 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1702 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1703 -- # true 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1705 -- # xtrace_fd 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1711 -- # lcov --version 00:09:03.259 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:09:03.520 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.520 --rc genhtml_branch_coverage=1 00:09:03.520 --rc genhtml_function_coverage=1 00:09:03.520 --rc genhtml_legend=1 00:09:03.520 --rc geninfo_all_blocks=1 00:09:03.520 --rc geninfo_unexecuted_blocks=1 00:09:03.520 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.520 ' 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:09:03.520 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.520 --rc genhtml_branch_coverage=1 00:09:03.520 --rc genhtml_function_coverage=1 00:09:03.520 --rc genhtml_legend=1 00:09:03.520 --rc geninfo_all_blocks=1 00:09:03.520 --rc geninfo_unexecuted_blocks=1 00:09:03.520 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.520 ' 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:09:03.520 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.520 --rc genhtml_branch_coverage=1 00:09:03.520 --rc genhtml_function_coverage=1 00:09:03.520 --rc genhtml_legend=1 00:09:03.520 --rc geninfo_all_blocks=1 00:09:03.520 --rc geninfo_unexecuted_blocks=1 00:09:03.520 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.520 ' 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:09:03.520 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:03.520 --rc genhtml_branch_coverage=1 00:09:03.520 --rc genhtml_function_coverage=1 00:09:03.520 --rc genhtml_legend=1 00:09:03.520 --rc geninfo_all_blocks=1 00:09:03.520 --rc geninfo_unexecuted_blocks=1 00:09:03.520 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:03.520 ' 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:09:03.520 12:51:06 
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:09:03.520 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:03.520 12:51:06 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:09:03.520 [2024-12-05 12:51:06.684405] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:09:03.520 [2024-12-05 12:51:06.684491] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid160464 ] 00:09:03.520 [2024-12-05 12:51:06.781932] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:03.520 [2024-12-05 12:51:06.808094] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.780 INFO: Running with entropic power schedule (0xFF, 100). 00:09:03.780 INFO: Seed: 2621953078 00:09:03.780 INFO: Loaded 1 modules (386960 inline 8-bit counters): 386960 [0x2a7d84c, 0x2adbfdc), 00:09:03.780 INFO: Loaded 1 PC tables (386960 PCs): 386960 [0x2adbfe0,0x30c38e0), 00:09:03.780 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:03.780 INFO: A corpus is not provided, starting from an empty corpus 00:09:03.780 #2 INITED exec/s: 0 rss: 66Mb 00:09:03.780 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:03.780 This may also happen if the target rejected all inputs we tried so far 00:09:03.780 [2024-12-05 12:51:07.058199] vfio_user.c:2873:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:09:04.328 NEW_FUNC[1/676]: 0x4521c8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:09:04.328 NEW_FUNC[2/676]: 0x457cd8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:04.328 #17 NEW cov: 11244 ft: 11212 corp: 2/7b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 5 InsertByte-ShuffleBytes-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:09:04.588 #23 NEW cov: 11261 ft: 14192 corp: 3/13b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ChangeBit- 00:09:04.588 NEW_FUNC[1/1]: 0x1c2d018 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:04.588 #24 NEW cov: 11278 ft: 14994 corp: 4/19b lim: 6 exec/s: 0 rss: 75Mb L: 6/6 MS: 1 ChangeBit- 00:09:04.848 #25 NEW cov: 11285 ft: 16768 corp: 5/25b lim: 6 exec/s: 25 rss: 75Mb L: 6/6 MS: 1 ChangeBinInt- 00:09:05.107 #26 NEW cov: 11285 ft: 16967 corp: 6/31b lim: 6 exec/s: 26 rss: 75Mb L: 6/6 MS: 1 ChangeBinInt- 00:09:05.367 #27 NEW cov: 11285 ft: 17710 corp: 7/37b lim: 6 exec/s: 27 rss: 75Mb L: 6/6 MS: 1 ShuffleBytes- 00:09:05.367 #28 NEW cov: 11285 ft: 18147 corp: 8/43b lim: 6 exec/s: 28 rss: 75Mb L: 6/6 MS: 1 ChangeBinInt- 00:09:05.627 #29 NEW cov: 11292 ft: 18268 corp: 9/49b lim: 6 exec/s: 29 rss: 75Mb L: 6/6 MS: 1 ChangeBit- 00:09:05.887 #30 NEW cov: 11292 ft: 18958 corp: 10/55b lim: 6 exec/s: 30 rss: 75Mb L: 6/6 MS: 1 CopyPart- 00:09:05.887 #42 NEW cov: 11292 ft: 18995 corp: 11/61b lim: 6 exec/s: 21 rss: 75Mb L: 6/6 MS: 2 EraseBytes-CopyPart- 00:09:05.887 #42 DONE cov: 11292 ft: 18995 corp: 11/61b lim: 6 exec/s: 21 rss: 75Mb 00:09:05.887 Done 42 runs in 2 second(s) 00:09:05.887 [2024-12-05 12:51:09.189015] vfio_user.c:2835:disable_ctrlr: *NOTICE*: 
/tmp/vfio-user-0/domain/2: disabling controller 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:09:06.147 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:06.147 12:51:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:09:06.147 [2024-12-05 12:51:09.448699] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:09:06.147 [2024-12-05 12:51:09.448782] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid161004 ] 00:09:06.407 [2024-12-05 12:51:09.544664] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:06.407 [2024-12-05 12:51:09.566824] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.666 INFO: Running with entropic power schedule (0xFF, 100). 
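Each fuzzer run opens with the same libFuzzer banner, then logs one status record per interesting input; decoded field by field against run 0 above (this is standard libFuzzer output, not suite-specific):

  # INFO: Running with entropic power schedule (0xFF, 100)
  #     entropic energy assignment is on (the default in recent LLVM):
  #     inputs that exercise rare features receive more mutation time
  # INFO: Seed: 2621953078            PRNG seed driving the run's mutations
  # INFO: Loaded 1 modules (386960 inline 8-bit counters)
  #     SanitizerCoverage inline-8bit-counters instrumentation; the 'cov:'
  #     figures in each run are measured against these 386960 edge counters
  # #25 NEW cov: 11285 ft: 16768 corp: 5/25b lim: 6 exec/s: 25 rss: 75Mb L: 6/6 MS: 1 ChangeBinInt-
  #     input number 25 found NEW coverage: 11285 edges, 16768 features,
  #     corpus is now 5 units / 25 bytes, current length limit 6, 25 execs/s,
  #     75 MB RSS, this unit is 6 of max 6 bytes, produced by a one-step
  #     mutation sequence (ChangeBinInt)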
00:09:06.666 INFO: Seed: 1075982803 00:09:06.666 INFO: Loaded 1 modules (386960 inline 8-bit counters): 386960 [0x2a7d84c, 0x2adbfdc), 00:09:06.666 INFO: Loaded 1 PC tables (386960 PCs): 386960 [0x2adbfe0,0x30c38e0), 00:09:06.666 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:06.666 INFO: A corpus is not provided, starting from an empty corpus 00:09:06.666 #2 INITED exec/s: 0 rss: 66Mb 00:09:06.666 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:06.667 This may also happen if the target rejected all inputs we tried so far 00:09:06.667 [2024-12-05 12:51:09.801803] vfio_user.c:2873:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:09:06.667 [2024-12-05 12:51:09.874139] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:06.667 [2024-12-05 12:51:09.874166] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:06.667 [2024-12-05 12:51:09.874184] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:07.187 NEW_FUNC[1/678]: 0x452768 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:09:07.187 NEW_FUNC[2/678]: 0x457cd8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:07.187 #6 NEW cov: 11244 ft: 11040 corp: 2/5b lim: 4 exec/s: 0 rss: 72Mb L: 4/4 MS: 4 InsertByte-CrossOver-CrossOver-InsertByte- 00:09:07.187 [2024-12-05 12:51:10.362032] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:07.187 [2024-12-05 12:51:10.362071] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:07.187 [2024-12-05 12:51:10.362089] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:07.187 #12 NEW cov: 11258 ft: 14560 corp: 3/9b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 1 ChangeBinInt- 00:09:07.447 [2024-12-05 12:51:10.545440] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:07.447 [2024-12-05 12:51:10.545466] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:07.447 [2024-12-05 12:51:10.545484] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:07.447 NEW_FUNC[1/1]: 0x1c2d018 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:07.447 #13 NEW cov: 11275 ft: 15922 corp: 4/13b lim: 4 exec/s: 0 rss: 75Mb L: 4/4 MS: 1 ShuffleBytes- 00:09:07.447 [2024-12-05 12:51:10.726007] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:07.447 [2024-12-05 12:51:10.726030] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:07.447 [2024-12-05 12:51:10.726048] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:07.707 #14 NEW cov: 11275 ft: 16373 corp: 5/17b lim: 4 exec/s: 14 rss: 75Mb L: 4/4 MS: 1 ChangeBinInt- 00:09:07.707 [2024-12-05 12:51:10.893207] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:07.707 [2024-12-05 12:51:10.893229] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:07.707 [2024-12-05 12:51:10.893247] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 
1 return failure 00:09:07.707 #20 NEW cov: 11275 ft: 17185 corp: 6/21b lim: 4 exec/s: 20 rss: 75Mb L: 4/4 MS: 1 ChangeByte- 00:09:07.968 [2024-12-05 12:51:11.065283] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:07.968 [2024-12-05 12:51:11.065306] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:07.968 [2024-12-05 12:51:11.065322] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:07.968 #21 NEW cov: 11275 ft: 17308 corp: 7/25b lim: 4 exec/s: 21 rss: 75Mb L: 4/4 MS: 1 CopyPart- 00:09:07.968 [2024-12-05 12:51:11.240368] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:07.968 [2024-12-05 12:51:11.240389] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:07.968 [2024-12-05 12:51:11.240406] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:08.228 #22 NEW cov: 11275 ft: 17785 corp: 8/29b lim: 4 exec/s: 22 rss: 75Mb L: 4/4 MS: 1 CrossOver- 00:09:08.228 [2024-12-05 12:51:11.409704] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:08.228 [2024-12-05 12:51:11.409730] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:08.228 [2024-12-05 12:51:11.409747] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:08.228 #23 NEW cov: 11275 ft: 17851 corp: 9/33b lim: 4 exec/s: 23 rss: 75Mb L: 4/4 MS: 1 ChangeBinInt- 00:09:08.488 [2024-12-05 12:51:11.585061] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:08.488 [2024-12-05 12:51:11.585084] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:08.488 [2024-12-05 12:51:11.585101] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:08.488 #27 NEW cov: 11282 ft: 18053 corp: 10/37b lim: 4 exec/s: 27 rss: 75Mb L: 4/4 MS: 4 EraseBytes-EraseBytes-CMP-InsertByte- DE: "\002\000"- 00:09:08.488 [2024-12-05 12:51:11.752661] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:08.488 [2024-12-05 12:51:11.752683] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:08.488 [2024-12-05 12:51:11.752700] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:08.748 #28 NEW cov: 11282 ft: 18382 corp: 11/41b lim: 4 exec/s: 14 rss: 75Mb L: 4/4 MS: 1 CopyPart- 00:09:08.748 #28 DONE cov: 11282 ft: 18382 corp: 11/41b lim: 4 exec/s: 14 rss: 75Mb 00:09:08.748 ###### Recommended dictionary. ###### 00:09:08.748 "\002\000" # Uses: 0 00:09:08.748 ###### End of recommended dictionary. 
###### 00:09:08.748 Done 28 runs in 2 second(s) 00:09:08.748 [2024-12-05 12:51:11.873031] vfio_user.c:2835:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:09:09.009 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:09.009 12:51:12 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:09:09.009 [2024-12-05 12:51:12.136942] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
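Runs 0, 1, and 2 above are launched by an identical recipe; condensed into a sketch reconstructed from the traced vfio/run.sh commands (the redirections on the two echo lines are inferred from LSAN_OPTIONS rather than visible in the trace, and -P, the crash-artifact directory, is left out for brevity):

  N=2                                        # fuzzer_type, 0..6
  base=/tmp/vfio-user-$N
  mkdir -p $base/domain/1 $base/domain/2 "$corpus_dir"
  # give the run private vfio-user sockets by rewriting the template config
  sed -e "s%/tmp/vfio-user/domain/1%$base/domain/1%;
          s%/tmp/vfio-user/domain/2%$base/domain/2%" \
      test/fuzz/llvm/vfio/fuzz_vfio_json.conf > $base/fuzz_vfio_json.conf
  # suppress two known-benign allocations so LeakSanitizer stays quiet
  echo leak:spdk_nvmf_qpair_disconnect  > /var/tmp/suppress_vfio_fuzz
  echo leak:nvmf_ctrlr_create          >> /var/tmp/suppress_vfio_fuzz
  # -Z selects the fuzz target (it tracks fuzzer_type in every run here),
  # -t is the per-run time budget, -D the persistent shared corpus directory
  llvm_vfio_fuzz -m 0x1 -s 0 -F $base/domain/1 -c $base/fuzz_vfio_json.conf \
      -t 1 -D "$corpus_dir" -Y $base/domain/2 -r $base/spdk$N.sock -Z $N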
00:09:09.009 [2024-12-05 12:51:12.137014] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid161422 ] 00:09:09.009 [2024-12-05 12:51:12.232899] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.009 [2024-12-05 12:51:12.254941] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.269 INFO: Running with entropic power schedule (0xFF, 100). 00:09:09.269 INFO: Seed: 3770991425 00:09:09.269 INFO: Loaded 1 modules (386960 inline 8-bit counters): 386960 [0x2a7d84c, 0x2adbfdc), 00:09:09.269 INFO: Loaded 1 PC tables (386960 PCs): 386960 [0x2adbfe0,0x30c38e0), 00:09:09.269 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:09.269 INFO: A corpus is not provided, starting from an empty corpus 00:09:09.269 #2 INITED exec/s: 0 rss: 66Mb 00:09:09.269 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:09.269 This may also happen if the target rejected all inputs we tried so far 00:09:09.269 [2024-12-05 12:51:12.497710] vfio_user.c:2873:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:09:09.269 [2024-12-05 12:51:12.566103] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:09.790 NEW_FUNC[1/677]: 0x453158 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:09:09.790 NEW_FUNC[2/677]: 0x457cd8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:09.790 #14 NEW cov: 11224 ft: 10723 corp: 2/9b lim: 8 exec/s: 0 rss: 72Mb L: 8/8 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:09:09.790 [2024-12-05 12:51:13.037856] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:10.050 #15 NEW cov: 11238 ft: 14031 corp: 3/17b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 1 ChangeBit- 00:09:10.050 [2024-12-05 12:51:13.222957] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:10.050 NEW_FUNC[1/1]: 0x1c2d018 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:10.050 #16 NEW cov: 11255 ft: 14913 corp: 4/25b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:10.309 [2024-12-05 12:51:13.406895] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:10.310 #17 NEW cov: 11258 ft: 15439 corp: 5/33b lim: 8 exec/s: 17 rss: 74Mb L: 8/8 MS: 1 CopyPart- 00:09:10.310 [2024-12-05 12:51:13.590023] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:10.570 #18 NEW cov: 11258 ft: 15494 corp: 6/41b lim: 8 exec/s: 18 rss: 74Mb L: 8/8 MS: 1 CrossOver- 00:09:10.570 [2024-12-05 12:51:13.775875] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:10.831 #19 NEW cov: 11258 ft: 16250 corp: 7/49b lim: 8 exec/s: 19 rss: 74Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:10.831 [2024-12-05 12:51:13.960950] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:10.831 #20 NEW cov: 11258 ft: 16345 corp: 8/57b lim: 8 exec/s: 20 rss: 74Mb L: 8/8 MS: 1 ChangeByte- 00:09:11.090 [2024-12-05 12:51:14.148142] 
vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:11.090 #31 NEW cov: 11258 ft: 16512 corp: 9/65b lim: 8 exec/s: 31 rss: 74Mb L: 8/8 MS: 1 ChangeBit- 00:09:11.090 [2024-12-05 12:51:14.333702] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:11.348 #32 NEW cov: 11265 ft: 16548 corp: 10/73b lim: 8 exec/s: 32 rss: 74Mb L: 8/8 MS: 1 ChangeBinInt- 00:09:11.348 [2024-12-05 12:51:14.519786] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:11.348 #43 NEW cov: 11265 ft: 17037 corp: 11/81b lim: 8 exec/s: 21 rss: 74Mb L: 8/8 MS: 1 CopyPart- 00:09:11.348 #43 DONE cov: 11265 ft: 17037 corp: 11/81b lim: 8 exec/s: 21 rss: 74Mb 00:09:11.348 Done 43 runs in 2 second(s) 00:09:11.348 [2024-12-05 12:51:14.648019] vfio_user.c:2835:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:09:11.607 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:11.607 12:51:14 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:09:11.607 [2024-12-05 12:51:14.902856] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:09:11.607 [2024-12-05 12:51:14.902933] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid161828 ] 00:09:11.866 [2024-12-05 12:51:14.998632] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:11.866 [2024-12-05 12:51:15.020390] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:12.124 INFO: Running with entropic power schedule (0xFF, 100). 00:09:12.124 INFO: Seed: 2239031477 00:09:12.124 INFO: Loaded 1 modules (386960 inline 8-bit counters): 386960 [0x2a7d84c, 0x2adbfdc), 00:09:12.124 INFO: Loaded 1 PC tables (386960 PCs): 386960 [0x2adbfe0,0x30c38e0), 00:09:12.124 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:12.124 INFO: A corpus is not provided, starting from an empty corpus 00:09:12.124 #2 INITED exec/s: 0 rss: 66Mb 00:09:12.124 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:12.124 This may also happen if the target rejected all inputs we tried so far 00:09:12.124 [2024-12-05 12:51:15.272012] vfio_user.c:2873:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:09:12.641 NEW_FUNC[1/677]: 0x453848 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:09:12.641 NEW_FUNC[2/677]: 0x457cd8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:12.641 #226 NEW cov: 11233 ft: 11191 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 4 ChangeBit-InsertRepeatedBytes-ChangeBit-CopyPart- 00:09:12.641 #227 NEW cov: 11249 ft: 14065 corp: 3/65b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:12.900 NEW_FUNC[1/1]: 0x1c2d018 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:12.900 #228 NEW cov: 11266 ft: 14904 corp: 4/97b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:13.158 #229 NEW cov: 11266 ft: 15814 corp: 5/129b lim: 32 exec/s: 229 rss: 75Mb L: 32/32 MS: 1 CopyPart- 00:09:13.417 #230 NEW cov: 11266 ft: 15875 corp: 6/161b lim: 32 exec/s: 230 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:13.417 #231 NEW cov: 11266 ft: 16455 corp: 7/193b lim: 32 exec/s: 231 rss: 75Mb L: 32/32 MS: 1 ChangeBit- 00:09:13.675 #232 NEW cov: 11266 ft: 17175 corp: 8/225b lim: 32 exec/s: 232 rss: 75Mb L: 32/32 MS: 1 CopyPart- 00:09:13.934 #233 NEW cov: 11273 ft: 17829 corp: 9/257b lim: 32 exec/s: 233 rss: 75Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:14.192 #234 NEW cov: 11273 ft: 18197 corp: 10/289b lim: 32 exec/s: 117 rss: 75Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:14.192 #234 DONE cov: 11273 ft: 18197 corp: 10/289b lim: 32 exec/s: 117 rss: 75Mb 00:09:14.192 Done 234 runs in 2 second(s) 00:09:14.192 [2024-12-05 12:51:17.267017] vfio_user.c:2835:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- 
../common.sh@72 -- # (( i++ )) 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:09:14.192 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:14.192 12:51:17 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:09:14.451 [2024-12-05 12:51:17.524854] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 00:09:14.451 [2024-12-05 12:51:17.524929] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid162369 ] 00:09:14.451 [2024-12-05 12:51:17.596525] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.451 [2024-12-05 12:51:17.618464] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:14.710 INFO: Running with entropic power schedule (0xFF, 100). 
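The ../common.sh@72-73 markers that bracket every run are this driver loop: it walks each registered fuzz target once with the same time budget and core mask. A reconstruction from the traced lines (a sketch; the body is not copied verbatim from common.sh):

  start_llvm_fuzz_short() {
      local fuzz_num=$1              # 7 in this suite
      local time=$2                  # per-run budget, 1 here
      for ((i = 0; i < fuzz_num; i++)); do
          start_llvm_fuzz $i $time 0x1    # fuzzer_type, time, core mask
      done
  }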
00:09:14.710 INFO: Seed: 541054397 00:09:14.710 INFO: Loaded 1 modules (386960 inline 8-bit counters): 386960 [0x2a7d84c, 0x2adbfdc), 00:09:14.710 INFO: Loaded 1 PC tables (386960 PCs): 386960 [0x2adbfe0,0x30c38e0), 00:09:14.710 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:14.711 INFO: A corpus is not provided, starting from an empty corpus 00:09:14.711 #2 INITED exec/s: 0 rss: 67Mb 00:09:14.711 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:14.711 This may also happen if the target rejected all inputs we tried so far 00:09:14.711 [2024-12-05 12:51:17.857170] vfio_user.c:2873:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:09:15.229 NEW_FUNC[1/677]: 0x4540c8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:09:15.229 NEW_FUNC[2/677]: 0x457cd8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:15.229 #83 NEW cov: 11234 ft: 11202 corp: 2/33b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:09:15.229 #94 NEW cov: 11251 ft: 14255 corp: 3/65b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:15.488 NEW_FUNC[1/1]: 0x1c2d018 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:15.488 #100 NEW cov: 11268 ft: 15433 corp: 4/97b lim: 32 exec/s: 0 rss: 76Mb L: 32/32 MS: 1 CMP- DE: "\352\015\000\000"- 00:09:15.746 #101 NEW cov: 11268 ft: 15803 corp: 5/129b lim: 32 exec/s: 101 rss: 76Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:16.006 #102 NEW cov: 11268 ft: 16239 corp: 6/161b lim: 32 exec/s: 102 rss: 76Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:16.006 #103 NEW cov: 11268 ft: 16282 corp: 7/193b lim: 32 exec/s: 103 rss: 76Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:16.265 #104 NEW cov: 11268 ft: 16660 corp: 8/225b lim: 32 exec/s: 104 rss: 76Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:16.524 #108 NEW cov: 11275 ft: 16999 corp: 9/257b lim: 32 exec/s: 108 rss: 76Mb L: 32/32 MS: 4 EraseBytes-InsertRepeatedBytes-PersAutoDict-CopyPart- DE: "\352\015\000\000"- 00:09:16.524 #109 NEW cov: 11275 ft: 17053 corp: 10/289b lim: 32 exec/s: 109 rss: 77Mb L: 32/32 MS: 1 CopyPart- 00:09:16.784 #110 NEW cov: 11275 ft: 17267 corp: 11/321b lim: 32 exec/s: 55 rss: 77Mb L: 32/32 MS: 1 CrossOver- 00:09:16.784 #110 DONE cov: 11275 ft: 17267 corp: 11/321b lim: 32 exec/s: 55 rss: 77Mb 00:09:16.784 ###### Recommended dictionary. ###### 00:09:16.784 "\352\015\000\000" # Uses: 1 00:09:16.784 ###### End of recommended dictionary. 
###### 00:09:16.784 Done 110 runs in 2 second(s) 00:09:16.784 [2024-12-05 12:51:20.012029] vfio_user.c:2835:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:09:17.045 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:17.045 12:51:20 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:09:17.045 [2024-12-05 12:51:20.279270] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
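The fuzz_num=7 computed at the top of the suite is simply the count of '.fn =' initializers in the fuzzer source, i.e. the size of the fuzz-target dispatch table; the NEW_FUNC lines in each run name the entry that -Z selected. The derivation, with the second command added for illustration (it is not in the trace):

  grep -c '\.fn =' test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c   # -> 7
  # listing the same lines shows one handler per fuzzer_type, matching the
  # NEW_FUNC output across the runs: fuzz_vfio_user_region_rw (0),
  # _version (1), _get_region_info (2), _dma_map (3), _dma_unmap (4),
  # _irq_set (5), _set_msix (6)
  grep -n '\.fn =' test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c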
00:09:17.045 [2024-12-05 12:51:20.279362] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid162899 ] 00:09:17.306 [2024-12-05 12:51:20.374607] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:17.306 [2024-12-05 12:51:20.397062] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:17.306 INFO: Running with entropic power schedule (0xFF, 100). 00:09:17.306 INFO: Seed: 3324062639 00:09:17.306 INFO: Loaded 1 modules (386960 inline 8-bit counters): 386960 [0x2a7d84c, 0x2adbfdc), 00:09:17.306 INFO: Loaded 1 PC tables (386960 PCs): 386960 [0x2adbfe0,0x30c38e0), 00:09:17.306 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:17.306 INFO: A corpus is not provided, starting from an empty corpus 00:09:17.306 #2 INITED exec/s: 0 rss: 67Mb 00:09:17.306 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:17.306 This may also happen if the target rejected all inputs we tried so far 00:09:17.566 [2024-12-05 12:51:20.641042] vfio_user.c:2873:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:09:17.566 [2024-12-05 12:51:20.709085] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:17.566 [2024-12-05 12:51:20.709124] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:17.825 NEW_FUNC[1/678]: 0x454ac8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:09:17.825 NEW_FUNC[2/678]: 0x457cd8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:17.825 #76 NEW cov: 11246 ft: 10735 corp: 2/14b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 4 InsertRepeatedBytes-InsertRepeatedBytes-ChangeBinInt-InsertByte- 00:09:18.085 [2024-12-05 12:51:21.193047] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:18.086 [2024-12-05 12:51:21.193089] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:18.086 #77 NEW cov: 11260 ft: 14117 corp: 3/27b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 1 CopyPart- 00:09:18.086 [2024-12-05 12:51:21.374966] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:18.086 [2024-12-05 12:51:21.374998] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:18.346 NEW_FUNC[1/1]: 0x1c2d018 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:18.346 #78 NEW cov: 11277 ft: 16187 corp: 4/40b lim: 13 exec/s: 0 rss: 75Mb L: 13/13 MS: 1 CrossOver- 00:09:18.346 [2024-12-05 12:51:21.573099] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:18.346 [2024-12-05 12:51:21.573132] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:18.606 #79 NEW cov: 11277 ft: 16990 corp: 5/53b lim: 13 exec/s: 79 rss: 75Mb L: 13/13 MS: 1 CrossOver- 00:09:18.606 [2024-12-05 12:51:21.748850] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:18.606 [2024-12-05 12:51:21.748881] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return 
failure 00:09:18.606 #80 NEW cov: 11277 ft: 17467 corp: 6/66b lim: 13 exec/s: 80 rss: 75Mb L: 13/13 MS: 1 CopyPart- 00:09:18.866 [2024-12-05 12:51:21.948026] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:18.866 [2024-12-05 12:51:21.948057] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:18.866 #81 NEW cov: 11277 ft: 17682 corp: 7/79b lim: 13 exec/s: 81 rss: 76Mb L: 13/13 MS: 1 ChangeByte- 00:09:18.866 [2024-12-05 12:51:22.125739] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:18.866 [2024-12-05 12:51:22.125770] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:19.126 #82 NEW cov: 11277 ft: 18010 corp: 8/92b lim: 13 exec/s: 82 rss: 76Mb L: 13/13 MS: 1 ChangeBit- 00:09:19.126 [2024-12-05 12:51:22.310032] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:19.126 [2024-12-05 12:51:22.310062] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:19.126 #88 NEW cov: 11284 ft: 18102 corp: 9/105b lim: 13 exec/s: 88 rss: 76Mb L: 13/13 MS: 1 ChangeBinInt- 00:09:19.386 [2024-12-05 12:51:22.486553] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:19.386 [2024-12-05 12:51:22.486584] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:19.386 #89 NEW cov: 11284 ft: 18318 corp: 10/118b lim: 13 exec/s: 89 rss: 76Mb L: 13/13 MS: 1 InsertRepeatedBytes- 00:09:19.386 [2024-12-05 12:51:22.678198] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:19.386 [2024-12-05 12:51:22.678229] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:19.646 #90 NEW cov: 11284 ft: 18628 corp: 11/131b lim: 13 exec/s: 45 rss: 76Mb L: 13/13 MS: 1 CMP- DE: "\001\001"- 00:09:19.646 #90 DONE cov: 11284 ft: 18628 corp: 11/131b lim: 13 exec/s: 45 rss: 76Mb 00:09:19.646 ###### Recommended dictionary. ###### 00:09:19.646 "\001\001" # Uses: 0 00:09:19.646 ###### End of recommended dictionary. 
###### 00:09:19.646 Done 90 runs in 2 second(s) 00:09:19.646 [2024-12-05 12:51:22.804028] vfio_user.c:2835:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:19.906 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:19.906 12:51:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:19.906 [2024-12-05 12:51:23.066039] Starting SPDK v25.01-pre git sha1 8d3947977 / DPDK 22.11.4 initialization... 
00:09:19.906 [2024-12-05 12:51:23.066115] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid163352 ] 00:09:19.906 [2024-12-05 12:51:23.162828] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:19.906 [2024-12-05 12:51:23.185676] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.166 INFO: Running with entropic power schedule (0xFF, 100). 00:09:20.166 INFO: Seed: 1806090545 00:09:20.166 INFO: Loaded 1 modules (386960 inline 8-bit counters): 386960 [0x2a7d84c, 0x2adbfdc), 00:09:20.166 INFO: Loaded 1 PC tables (386960 PCs): 386960 [0x2adbfe0,0x30c38e0), 00:09:20.166 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:20.166 INFO: A corpus is not provided, starting from an empty corpus 00:09:20.166 #2 INITED exec/s: 0 rss: 66Mb 00:09:20.166 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:20.166 This may also happen if the target rejected all inputs we tried so far 00:09:20.166 [2024-12-05 12:51:23.417227] vfio_user.c:2873:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:09:20.166 [2024-12-05 12:51:23.444873] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:20.166 [2024-12-05 12:51:23.444909] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:20.688 NEW_FUNC[1/678]: 0x4557b8 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:20.688 NEW_FUNC[2/678]: 0x457cd8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:20.688 #17 NEW cov: 11235 ft: 11209 corp: 2/10b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 5 ChangeByte-ChangeBinInt-InsertRepeatedBytes-InsertByte-InsertRepeatedBytes- 00:09:20.688 [2024-12-05 12:51:23.872797] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:20.688 [2024-12-05 12:51:23.872838] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:20.688 #18 NEW cov: 11249 ft: 15025 corp: 3/19b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 1 ShuffleBytes- 00:09:20.688 [2024-12-05 12:51:23.994864] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:20.688 [2024-12-05 12:51:23.994900] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:20.949 #19 NEW cov: 11252 ft: 15345 corp: 4/28b lim: 9 exec/s: 0 rss: 75Mb L: 9/9 MS: 1 CrossOver- 00:09:20.949 [2024-12-05 12:51:24.116964] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:20.949 [2024-12-05 12:51:24.117000] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:20.949 #20 NEW cov: 11252 ft: 15649 corp: 5/37b lim: 9 exec/s: 0 rss: 75Mb L: 9/9 MS: 1 CopyPart- 00:09:20.949 [2024-12-05 12:51:24.230026] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:20.949 [2024-12-05 12:51:24.230067] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.209 NEW_FUNC[1/1]: 0x1c2d018 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:21.209 #31 NEW cov: 11269 ft: 16608 corp: 6/46b lim: 9 exec/s: 0 rss: 75Mb L: 9/9 MS: 1 ChangeBinInt- 00:09:21.209 [2024-12-05 12:51:24.353026] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.209 [2024-12-05 12:51:24.353062] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.209 #32 NEW cov: 11269 ft: 16965 corp: 7/55b lim: 9 exec/s: 32 rss: 75Mb L: 9/9 MS: 1 ChangeByte- 00:09:21.209 [2024-12-05 12:51:24.477281] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.209 [2024-12-05 12:51:24.477319] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.469 #33 NEW cov: 11269 ft: 17065 corp: 8/64b lim: 9 exec/s: 33 rss: 76Mb L: 9/9 MS: 1 CrossOver- 00:09:21.469 [2024-12-05 12:51:24.601360] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.469 [2024-12-05 12:51:24.601396] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.469 #44 NEW cov: 11269 ft: 17118 corp: 9/73b lim: 9 exec/s: 44 rss: 76Mb L: 9/9 MS: 1 ShuffleBytes- 00:09:21.469 [2024-12-05 12:51:24.724515] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.469 [2024-12-05 12:51:24.724551] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.729 #50 NEW cov: 11269 ft: 17647 corp: 10/82b lim: 9 exec/s: 50 rss: 76Mb L: 9/9 MS: 1 ChangeBit- 00:09:21.729 [2024-12-05 12:51:24.847640] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.729 [2024-12-05 12:51:24.847674] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.729 #51 NEW cov: 11269 ft: 17763 corp: 11/91b lim: 9 exec/s: 51 rss: 76Mb L: 9/9 MS: 1 CrossOver- 00:09:21.729 [2024-12-05 12:51:24.959682] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.729 [2024-12-05 12:51:24.959718] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.729 #52 NEW cov: 11269 ft: 17858 corp: 12/100b lim: 9 exec/s: 52 rss: 76Mb L: 9/9 MS: 1 ChangeByte- 00:09:21.988 [2024-12-05 12:51:25.081715] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.988 [2024-12-05 12:51:25.081752] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.988 #53 NEW cov: 11269 ft: 17905 corp: 13/109b lim: 9 exec/s: 53 rss: 76Mb L: 9/9 MS: 1 ChangeBinInt- 00:09:21.988 [2024-12-05 12:51:25.192722] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:21.988 [2024-12-05 12:51:25.192758] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:21.988 #54 NEW cov: 11276 ft: 18205 corp: 14/118b lim: 9 exec/s: 54 rss: 76Mb L: 9/9 MS: 1 CopyPart- 00:09:22.247 [2024-12-05 12:51:25.314937] vfio_user.c:3143:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:22.247 [2024-12-05 12:51:25.314972] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:22.247 #55 NEW cov: 11276 ft: 18220 corp: 15/127b lim: 9 exec/s: 27 rss: 76Mb L: 9/9 MS: 1 ChangeByte- 00:09:22.247 #55 DONE cov: 11276 ft: 18220 corp: 15/127b lim: 9 exec/s: 27 rss: 76Mb 
00:09:22.247 Done 55 runs in 2 second(s) 00:09:22.247 [2024-12-05 12:51:25.409016] vfio_user.c:2835:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:09:22.506 12:51:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:09:22.506 12:51:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:22.506 12:51:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:22.506 12:51:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:09:22.506 00:09:22.506 real 0m19.460s 00:09:22.506 user 0m27.329s 00:09:22.506 sys 0m1.916s 00:09:22.506 12:51:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:22.506 12:51:25 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:22.506 ************************************ 00:09:22.506 END TEST vfio_llvm_fuzz 00:09:22.506 ************************************ 00:09:22.506 00:09:22.506 real 1m22.891s 00:09:22.506 user 2m6.876s 00:09:22.506 sys 0m9.661s 00:09:22.506 12:51:25 llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:22.506 12:51:25 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:22.506 ************************************ 00:09:22.506 END TEST llvm_fuzz 00:09:22.506 ************************************ 00:09:22.506 12:51:25 -- spdk/autotest.sh@378 -- # [[ '' -eq 1 ]] 00:09:22.506 12:51:25 -- spdk/autotest.sh@385 -- # trap - SIGINT SIGTERM EXIT 00:09:22.506 12:51:25 -- spdk/autotest.sh@387 -- # timing_enter post_cleanup 00:09:22.506 12:51:25 -- common/autotest_common.sh@726 -- # xtrace_disable 00:09:22.506 12:51:25 -- common/autotest_common.sh@10 -- # set +x 00:09:22.506 12:51:25 -- spdk/autotest.sh@388 -- # autotest_cleanup 00:09:22.506 12:51:25 -- common/autotest_common.sh@1396 -- # local autotest_es=0 00:09:22.506 12:51:25 -- common/autotest_common.sh@1397 -- # xtrace_disable 00:09:22.506 12:51:25 -- common/autotest_common.sh@10 -- # set +x 00:09:29.076 INFO: APP EXITING 00:09:29.076 INFO: killing all VMs 00:09:29.076 INFO: killing vhost app 00:09:29.076 INFO: EXIT DONE 00:09:32.369 Waiting for block devices as requested 00:09:32.369 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:32.628 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:32.628 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:32.628 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:32.888 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:32.888 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:32.888 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:33.149 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:33.149 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:33.149 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:33.409 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:33.409 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:33.409 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:33.669 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:33.669 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:33.669 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:33.929 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:38.129 Cleaning 00:09:38.129 Removing: /dev/shm/spdk_tgt_trace.pid135434 00:09:38.129 Removing: /var/run/dpdk/spdk_pid132972 00:09:38.130 Removing: /var/run/dpdk/spdk_pid134115 00:09:38.130 Removing: /var/run/dpdk/spdk_pid135434 00:09:38.130 Removing: /var/run/dpdk/spdk_pid135890 00:09:38.130 Removing: /var/run/dpdk/spdk_pid136978 
00:09:38.130 Removing: /var/run/dpdk/spdk_pid137020 00:09:38.130 Removing: /var/run/dpdk/spdk_pid138106 00:09:38.130 Removing: /var/run/dpdk/spdk_pid138119 00:09:38.130 Removing: /var/run/dpdk/spdk_pid138554 00:09:38.130 Removing: /var/run/dpdk/spdk_pid138877 00:09:38.130 Removing: /var/run/dpdk/spdk_pid139198 00:09:38.130 Removing: /var/run/dpdk/spdk_pid139541 00:09:38.130 Removing: /var/run/dpdk/spdk_pid139698 00:09:38.130 Removing: /var/run/dpdk/spdk_pid139907 00:09:38.130 Removing: /var/run/dpdk/spdk_pid140189 00:09:38.130 Removing: /var/run/dpdk/spdk_pid140509 00:09:38.130 Removing: /var/run/dpdk/spdk_pid141357 00:09:38.130 Removing: /var/run/dpdk/spdk_pid144501 00:09:38.130 Removing: /var/run/dpdk/spdk_pid144731 00:09:38.130 Removing: /var/run/dpdk/spdk_pid144903 00:09:38.130 Removing: /var/run/dpdk/spdk_pid145065 00:09:38.130 Removing: /var/run/dpdk/spdk_pid145430 00:09:38.130 Removing: /var/run/dpdk/spdk_pid145581 00:09:38.130 Removing: /var/run/dpdk/spdk_pid145991 00:09:38.130 Removing: /var/run/dpdk/spdk_pid146043 00:09:38.130 Removing: /var/run/dpdk/spdk_pid146499 00:09:38.130 Removing: /var/run/dpdk/spdk_pid146558 00:09:38.130 Removing: /var/run/dpdk/spdk_pid146758 00:09:38.130 Removing: /var/run/dpdk/spdk_pid146857 00:09:38.130 Removing: /var/run/dpdk/spdk_pid147263 00:09:38.130 Removing: /var/run/dpdk/spdk_pid147531 00:09:38.130 Removing: /var/run/dpdk/spdk_pid147811 00:09:38.130 Removing: /var/run/dpdk/spdk_pid148087 00:09:38.130 Removing: /var/run/dpdk/spdk_pid148644 00:09:38.130 Removing: /var/run/dpdk/spdk_pid149179 00:09:38.130 Removing: /var/run/dpdk/spdk_pid149487 00:09:38.130 Removing: /var/run/dpdk/spdk_pid149994 00:09:38.130 Removing: /var/run/dpdk/spdk_pid150467 00:09:38.130 Removing: /var/run/dpdk/spdk_pid150830 00:09:38.130 Removing: /var/run/dpdk/spdk_pid151365 00:09:38.130 Removing: /var/run/dpdk/spdk_pid151757 00:09:38.130 Removing: /var/run/dpdk/spdk_pid152186 00:09:38.130 Removing: /var/run/dpdk/spdk_pid152722 00:09:38.130 Removing: /var/run/dpdk/spdk_pid153011 00:09:38.130 Removing: /var/run/dpdk/spdk_pid153541 00:09:38.130 Removing: /var/run/dpdk/spdk_pid154004 00:09:38.130 Removing: /var/run/dpdk/spdk_pid154362 00:09:38.130 Removing: /var/run/dpdk/spdk_pid154896 00:09:38.130 Removing: /var/run/dpdk/spdk_pid155277 00:09:38.130 Removing: /var/run/dpdk/spdk_pid155715 00:09:38.130 Removing: /var/run/dpdk/spdk_pid156243 00:09:38.130 Removing: /var/run/dpdk/spdk_pid156537 00:09:38.130 Removing: /var/run/dpdk/spdk_pid157072 00:09:38.130 Removing: /var/run/dpdk/spdk_pid157528 00:09:38.130 Removing: /var/run/dpdk/spdk_pid157919 00:09:38.130 Removing: /var/run/dpdk/spdk_pid158466 00:09:38.130 Removing: /var/run/dpdk/spdk_pid158852 00:09:38.130 Removing: /var/run/dpdk/spdk_pid159409 00:09:38.130 Removing: /var/run/dpdk/spdk_pid160464 00:09:38.130 Removing: /var/run/dpdk/spdk_pid161004 00:09:38.130 Removing: /var/run/dpdk/spdk_pid161422 00:09:38.130 Removing: /var/run/dpdk/spdk_pid161828 00:09:38.130 Removing: /var/run/dpdk/spdk_pid162369 00:09:38.130 Removing: /var/run/dpdk/spdk_pid162899 00:09:38.130 Removing: /var/run/dpdk/spdk_pid163352 00:09:38.130 Clean 00:09:38.130 12:51:40 -- common/autotest_common.sh@1453 -- # return 0 00:09:38.130 12:51:40 -- spdk/autotest.sh@389 -- # timing_exit post_cleanup 00:09:38.130 12:51:40 -- common/autotest_common.sh@732 -- # xtrace_disable 00:09:38.130 12:51:40 -- common/autotest_common.sh@10 -- # set +x 00:09:38.130 12:51:40 -- spdk/autotest.sh@391 -- # timing_exit autotest 00:09:38.130 12:51:40 -- common/autotest_common.sh@732 
00:09:38.130 12:51:40 -- common/autotest_common.sh@10 -- # set +x
00:09:38.130 12:51:41 -- spdk/autotest.sh@392 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:38.130 12:51:41 -- spdk/autotest.sh@394 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:38.130 12:51:41 -- spdk/autotest.sh@394 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:38.130 12:51:41 -- spdk/autotest.sh@396 -- # [[ y == y ]]
00:09:38.130 12:51:41 -- spdk/autotest.sh@398 -- # hostname
00:09:38.130 12:51:41 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info
00:09:44.714 geninfo: WARNING: invalid characters removed from testname!
00:09:44.714 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda
00:09:44.714 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda
00:09:51.298 12:51:53 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:57.873 12:52:00 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:03.154 12:52:06 -- spdk/autotest.sh@404 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:08.434 12:52:11 -- spdk/autotest.sh@405 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
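Note: the lcov invocations above form a capture-merge-filter pipeline: one capture pass writes the fuzz run's counters to cov_test.info (-c), the baseline and test tracefiles are summed into cov_total.info (-a), and successive remove passes (-r) strip DPDK, system, and tool sources from the total. A condensed sketch of the same flow, with the repeated --rc switches and the custom --gcov-tool wrapper elided and paths shortened for readability:

    # Capture, merge, and filter coverage the way the log above does (condensed).
    out=../output
    lcov -q -c --no-external -d ./spdk -t "$(hostname)" -o "$out/cov_test.info"
    lcov -q -a "$out/cov_base.info" -a "$out/cov_test.info" -o "$out/cov_total.info"
    # Drop sources that should not count toward SPDK coverage.
    for pat in '*/dpdk/*' '/usr/*' '*/examples/vmd/*' '*/app/spdk_lspci/*' '*/app/spdk_top/*'; do
        lcov -q -r "$out/cov_total.info" "$pat" -o "$out/cov_total.info"
    done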
00:10:13.714 12:52:16 -- spdk/autotest.sh@406 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:18.998 12:52:22 -- spdk/autotest.sh@407 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:24.279 12:52:27 -- spdk/autotest.sh@408 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:10:24.279 12:52:27 -- spdk/autorun.sh@1 -- $ timing_finish
00:10:24.279 12:52:27 -- common/autotest_common.sh@738 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt ]]
00:10:24.279 12:52:27 -- common/autotest_common.sh@740 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:10:24.279 12:52:27 -- common/autotest_common.sh@741 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]]
00:10:24.279 12:52:27 -- common/autotest_common.sh@744 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:10:24.279 + [[ -n 6196 ]]
00:10:24.279 + sudo kill 6196
00:10:24.291 [Pipeline] }
00:10:24.309 [Pipeline] // stage
00:10:24.314 [Pipeline] }
00:10:24.328 [Pipeline] // timeout
00:10:24.333 [Pipeline] }
00:10:24.348 [Pipeline] // catchError
00:10:24.353 [Pipeline] }
00:10:24.368 [Pipeline] // wrap
00:10:24.374 [Pipeline] }
00:10:24.390 [Pipeline] // catchError
00:10:24.401 [Pipeline] stage
00:10:24.403 [Pipeline] { (Epilogue)
00:10:24.418 [Pipeline] catchError
00:10:24.421 [Pipeline] {
00:10:24.435 [Pipeline] echo
00:10:24.436 Cleanup processes
00:10:24.442 [Pipeline] sh
00:10:24.733 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:24.733 171875 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:24.749 [Pipeline] sh
00:10:25.041 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:25.041 ++ grep -v 'sudo pgrep'
00:10:25.041 ++ awk '{print $1}'
00:10:25.041 + sudo kill -9
00:10:25.041 + true
00:10:25.056 [Pipeline] sh
00:10:25.346 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:10:25.346 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:25.346 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:26.725 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:38.958 [Pipeline] sh
00:10:39.252 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:10:39.252 Artifacts sizes are good
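Note: in the Cleanup processes stage above, pgrep matched only itself, so after the grep -v filter the kill -9 ran with no PID and the `+ true` that follows absorbed the resulting error, keeping the stage green. A sketch that makes the empty-match case explicit instead of relying on that trailing true; the guard is an illustrative alternative, not what the job actually runs:

    # Kill leftover processes started from the workspace, ignoring the pgrep itself;
    # skip kill entirely when nothing matched (illustrative guard).
    pids=$(sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk \
           | grep -v 'sudo pgrep' | awk '{print $1}')
    if [ -n "$pids" ]; then
        sudo kill -9 $pids
    fi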
00:10:39.268 [Pipeline] archiveArtifacts
00:10:39.276 Archiving artifacts
00:10:39.668 [Pipeline] sh
00:10:39.957 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:10:39.974 [Pipeline] cleanWs
00:10:39.984 [WS-CLEANUP] Deleting project workspace...
00:10:39.984 [WS-CLEANUP] Deferred wipeout is used...
00:10:39.991 [WS-CLEANUP] done
00:10:39.993 [Pipeline] }
00:10:40.012 [Pipeline] // catchError
00:10:40.026 [Pipeline] sh
00:10:40.316 + logger -p user.info -t JENKINS-CI
00:10:40.326 [Pipeline] }
00:10:40.340 [Pipeline] // stage
00:10:40.345 [Pipeline] }
00:10:40.361 [Pipeline] // node
00:10:40.367 [Pipeline] End of Pipeline
00:10:40.410 Finished: SUCCESS