00:00:00.001 Started by upstream project "autotest-spdk-v24.09-vs-dpdk-v22.11" build number 230 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3732 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.001 Started by timer 00:00:00.077 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.082 The recommended git tool is: git 00:00:00.082 using credential 00000000-0000-0000-0000-000000000002 00:00:00.084 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.103 Fetching changes from the remote Git repository 00:00:00.104 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.123 Using shallow fetch with depth 1 00:00:00.123 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.123 > git --version # timeout=10 00:00:00.149 > git --version # 'git version 2.39.2' 00:00:00.149 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.180 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.180 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:04.370 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:04.381 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:04.391 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD) 00:00:04.391 > git config core.sparsecheckout # timeout=10 00:00:04.402 > git read-tree -mu HEAD # timeout=10 00:00:04.416 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5 00:00:04.432 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag" 00:00:04.432 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10 00:00:04.509 [Pipeline] Start of Pipeline 00:00:04.518 [Pipeline] library 00:00:04.519 Loading library shm_lib@master 00:00:04.519 Library shm_lib@master is cached. Copying from home. 00:00:04.532 [Pipeline] node 00:00:04.548 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:04.549 [Pipeline] { 00:00:04.557 [Pipeline] catchError 00:00:04.558 [Pipeline] { 00:00:04.568 [Pipeline] wrap 00:00:04.576 [Pipeline] { 00:00:04.583 [Pipeline] stage 00:00:04.585 [Pipeline] { (Prologue) 00:00:04.783 [Pipeline] sh 00:00:05.065 + logger -p user.info -t JENKINS-CI 00:00:05.079 [Pipeline] echo 00:00:05.081 Node: WFP20 00:00:05.089 [Pipeline] sh 00:00:05.390 [Pipeline] setCustomBuildProperty 00:00:05.402 [Pipeline] echo 00:00:05.404 Cleanup processes 00:00:05.409 [Pipeline] sh 00:00:05.693 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:05.693 682499 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:05.707 [Pipeline] sh 00:00:05.993 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:05.993 ++ grep -v 'sudo pgrep' 00:00:05.993 ++ awk '{print $1}' 00:00:05.993 + sudo kill -9 00:00:05.993 + true 00:00:06.005 [Pipeline] cleanWs 00:00:06.013 [WS-CLEANUP] Deleting project workspace... 00:00:06.013 [WS-CLEANUP] Deferred wipeout is used... 
00:00:06.019 [WS-CLEANUP] done 00:00:06.023 [Pipeline] setCustomBuildProperty 00:00:06.034 [Pipeline] sh 00:00:06.318 + sudo git config --global --replace-all safe.directory '*' 00:00:06.400 [Pipeline] httpRequest 00:00:07.222 [Pipeline] echo 00:00:07.224 Sorcerer 10.211.164.20 is alive 00:00:07.232 [Pipeline] retry 00:00:07.234 [Pipeline] { 00:00:07.246 [Pipeline] httpRequest 00:00:07.251 HttpMethod: GET 00:00:07.251 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:07.252 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:07.270 Response Code: HTTP/1.1 200 OK 00:00:07.271 Success: Status code 200 is in the accepted range: 200,404 00:00:07.271 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:28.993 [Pipeline] } 00:00:29.010 [Pipeline] // retry 00:00:29.017 [Pipeline] sh 00:00:29.302 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:29.317 [Pipeline] httpRequest 00:00:29.693 [Pipeline] echo 00:00:29.695 Sorcerer 10.211.164.20 is alive 00:00:29.704 [Pipeline] retry 00:00:29.706 [Pipeline] { 00:00:29.719 [Pipeline] httpRequest 00:00:29.723 HttpMethod: GET 00:00:29.724 URL: http://10.211.164.20/packages/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:00:29.724 Sending request to url: http://10.211.164.20/packages/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:00:29.733 Response Code: HTTP/1.1 200 OK 00:00:29.734 Success: Status code 200 is in the accepted range: 200,404 00:00:29.734 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:01:49.362 [Pipeline] } 00:01:49.373 [Pipeline] // retry 00:01:49.378 [Pipeline] sh 00:01:49.658 + tar --no-same-owner -xf spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:01:52.206 [Pipeline] sh 00:01:52.490 + git -C spdk log --oneline -n5 00:01:52.490 b18e1bd62 version: v24.09.1-pre 00:01:52.490 19524ad45 version: v24.09 00:01:52.490 9756b40a3 dpdk: update submodule to include alarm_cancel fix 00:01:52.490 a808500d2 test/nvmf: disable nvmf_shutdown_tc4 on e810 00:01:52.490 3024272c6 bdev/nvme: take nvme_ctrlr.mutex when setting keys 00:01:52.506 [Pipeline] withCredentials 00:01:52.516 > git --version # timeout=10 00:01:52.528 > git --version # 'git version 2.39.2' 00:01:52.545 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:52.547 [Pipeline] { 00:01:52.556 [Pipeline] retry 00:01:52.558 [Pipeline] { 00:01:52.572 [Pipeline] sh 00:01:52.855 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:01:53.126 [Pipeline] } 00:01:53.144 [Pipeline] // retry 00:01:53.149 [Pipeline] } 00:01:53.165 [Pipeline] // withCredentials 00:01:53.174 [Pipeline] httpRequest 00:01:53.555 [Pipeline] echo 00:01:53.556 Sorcerer 10.211.164.20 is alive 00:01:53.566 [Pipeline] retry 00:01:53.568 [Pipeline] { 00:01:53.582 [Pipeline] httpRequest 00:01:53.587 HttpMethod: GET 00:01:53.587 URL: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:53.588 Sending request to url: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:53.591 Response Code: HTTP/1.1 200 OK 00:01:53.591 Success: Status code 200 is in the accepted range: 200,404 00:01:53.592 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:59.875 
[Pipeline] } 00:01:59.892 [Pipeline] // retry 00:01:59.899 [Pipeline] sh 00:02:00.184 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:02:01.575 [Pipeline] sh 00:02:01.860 + git -C dpdk log --oneline -n5 00:02:01.860 caf0f5d395 version: 22.11.4 00:02:01.860 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:02:01.860 dc9c799c7d vhost: fix missing spinlock unlock 00:02:01.860 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:02:01.860 6ef77f2a5e net/gve: fix RX buffer size alignment 00:02:01.870 [Pipeline] } 00:02:01.884 [Pipeline] // stage 00:02:01.893 [Pipeline] stage 00:02:01.895 [Pipeline] { (Prepare) 00:02:01.912 [Pipeline] writeFile 00:02:01.927 [Pipeline] sh 00:02:02.211 + logger -p user.info -t JENKINS-CI 00:02:02.224 [Pipeline] sh 00:02:02.509 + logger -p user.info -t JENKINS-CI 00:02:02.521 [Pipeline] sh 00:02:02.806 + cat autorun-spdk.conf 00:02:02.806 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:02.806 SPDK_TEST_FUZZER_SHORT=1 00:02:02.806 SPDK_TEST_FUZZER=1 00:02:02.806 SPDK_TEST_SETUP=1 00:02:02.806 SPDK_RUN_UBSAN=1 00:02:02.806 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:02:02.806 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:02.814 RUN_NIGHTLY=1 00:02:02.819 [Pipeline] readFile 00:02:02.842 [Pipeline] withEnv 00:02:02.845 [Pipeline] { 00:02:02.857 [Pipeline] sh 00:02:03.140 + set -ex 00:02:03.140 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:02:03.140 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:03.140 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:03.140 ++ SPDK_TEST_FUZZER_SHORT=1 00:02:03.140 ++ SPDK_TEST_FUZZER=1 00:02:03.140 ++ SPDK_TEST_SETUP=1 00:02:03.140 ++ SPDK_RUN_UBSAN=1 00:02:03.140 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:02:03.140 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:03.140 ++ RUN_NIGHTLY=1 00:02:03.140 + case $SPDK_TEST_NVMF_NICS in 00:02:03.140 + DRIVERS= 00:02:03.140 + [[ -n '' ]] 00:02:03.140 + exit 0 00:02:03.150 [Pipeline] } 00:02:03.165 [Pipeline] // withEnv 00:02:03.170 [Pipeline] } 00:02:03.184 [Pipeline] // stage 00:02:03.194 [Pipeline] catchError 00:02:03.196 [Pipeline] { 00:02:03.210 [Pipeline] timeout 00:02:03.210 Timeout set to expire in 30 min 00:02:03.212 [Pipeline] { 00:02:03.226 [Pipeline] stage 00:02:03.228 [Pipeline] { (Tests) 00:02:03.242 [Pipeline] sh 00:02:03.528 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:03.528 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:03.528 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:02:03.528 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:02:03.528 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:03.528 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:02:03.529 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:02:03.529 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:02:03.529 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:02:03.529 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:02:03.529 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:02:03.529 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:02:03.529 + source /etc/os-release 00:02:03.529 ++ NAME='Fedora Linux' 00:02:03.529 ++ VERSION='39 (Cloud Edition)' 00:02:03.529 ++ ID=fedora 00:02:03.529 ++ VERSION_ID=39 00:02:03.529 ++ VERSION_CODENAME= 00:02:03.529 ++ PLATFORM_ID=platform:f39 00:02:03.529 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:02:03.529 ++ ANSI_COLOR='0;38;2;60;110;180' 00:02:03.529 ++ LOGO=fedora-logo-icon 00:02:03.529 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:02:03.529 ++ HOME_URL=https://fedoraproject.org/ 00:02:03.529 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:02:03.529 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:02:03.529 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:02:03.529 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:02:03.529 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:02:03.529 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:02:03.529 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:02:03.529 ++ SUPPORT_END=2024-11-12 00:02:03.529 ++ VARIANT='Cloud Edition' 00:02:03.529 ++ VARIANT_ID=cloud 00:02:03.529 + uname -a 00:02:03.529 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:02:03.529 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:06.821 Hugepages 00:02:06.821 node hugesize free / total 00:02:06.821 node0 1048576kB 0 / 0 00:02:06.821 node0 2048kB 0 / 0 00:02:06.821 node1 1048576kB 0 / 0 00:02:06.821 node1 2048kB 0 / 0 00:02:06.821 00:02:06.821 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:06.821 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:02:06.821 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:02:06.821 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:02:06.821 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:02:06.821 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:02:06.821 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:02:06.821 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:02:06.821 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:02:06.821 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:02:06.821 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:02:06.821 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:02:06.821 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:02:06.821 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:02:06.821 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:02:06.821 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:02:06.821 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:02:06.821 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:02:06.821 + rm -f /tmp/spdk-ld-path 00:02:06.821 + source autorun-spdk.conf 00:02:06.821 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:06.821 ++ SPDK_TEST_FUZZER_SHORT=1 00:02:06.821 ++ SPDK_TEST_FUZZER=1 00:02:06.821 ++ SPDK_TEST_SETUP=1 00:02:06.821 ++ SPDK_RUN_UBSAN=1 00:02:06.821 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:02:06.821 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:06.821 ++ RUN_NIGHTLY=1 00:02:06.821 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:02:06.821 + [[ -n '' ]] 00:02:06.821 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:06.821 + for M in 
/var/spdk/build-*-manifest.txt 00:02:06.821 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:02:06.821 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:06.821 + for M in /var/spdk/build-*-manifest.txt 00:02:06.821 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:02:06.821 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:06.821 + for M in /var/spdk/build-*-manifest.txt 00:02:06.821 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:02:06.821 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:02:06.821 ++ uname 00:02:06.821 + [[ Linux == \L\i\n\u\x ]] 00:02:06.821 + sudo dmesg -T 00:02:06.821 + sudo dmesg --clear 00:02:06.821 + dmesg_pid=683434 00:02:06.821 + [[ Fedora Linux == FreeBSD ]] 00:02:06.821 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:06.821 + sudo dmesg -Tw 00:02:06.821 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:06.821 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:02:06.821 + [[ -x /usr/src/fio-static/fio ]] 00:02:06.821 + export FIO_BIN=/usr/src/fio-static/fio 00:02:06.821 + FIO_BIN=/usr/src/fio-static/fio 00:02:06.822 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:02:06.822 + [[ ! -v VFIO_QEMU_BIN ]] 00:02:06.822 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:02:06.822 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:06.822 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:06.822 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:02:06.822 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:06.822 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:06.822 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:02:06.822 Test configuration: 00:02:06.822 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:06.822 SPDK_TEST_FUZZER_SHORT=1 00:02:06.822 SPDK_TEST_FUZZER=1 00:02:06.822 SPDK_TEST_SETUP=1 00:02:06.822 SPDK_RUN_UBSAN=1 00:02:06.822 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:02:06.822 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:06.822 RUN_NIGHTLY=1 01:17:52 -- common/autotest_common.sh@1680 -- $ [[ n == y ]] 00:02:06.822 01:17:52 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:06.822 01:17:52 -- scripts/common.sh@15 -- $ shopt -s extglob 00:02:06.822 01:17:52 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:02:06.822 01:17:52 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:06.822 01:17:52 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:06.822 01:17:52 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:06.822 01:17:52 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:06.822 01:17:52 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:06.822 01:17:52 -- paths/export.sh@5 -- $ export PATH 00:02:06.822 01:17:52 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:06.822 01:17:52 -- common/autobuild_common.sh@478 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:06.822 01:17:52 -- common/autobuild_common.sh@479 -- $ date +%s 00:02:06.822 01:17:52 -- common/autobuild_common.sh@479 -- $ mktemp -dt spdk_1734394672.XXXXXX 00:02:06.822 01:17:52 -- common/autobuild_common.sh@479 -- $ SPDK_WORKSPACE=/tmp/spdk_1734394672.CqMHIR 00:02:06.822 01:17:52 -- common/autobuild_common.sh@481 -- $ [[ -n '' ]] 00:02:06.822 01:17:52 -- common/autobuild_common.sh@485 -- $ '[' -n v22.11.4 ']' 00:02:06.822 01:17:52 -- common/autobuild_common.sh@486 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:06.822 01:17:52 -- common/autobuild_common.sh@486 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:02:06.822 01:17:52 -- common/autobuild_common.sh@492 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:02:06.822 01:17:52 -- common/autobuild_common.sh@494 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:02:06.822 01:17:52 -- common/autobuild_common.sh@495 -- $ get_config_params 00:02:06.822 01:17:52 -- common/autotest_common.sh@407 -- $ xtrace_disable 00:02:06.822 01:17:52 -- common/autotest_common.sh@10 -- $ set +x 00:02:07.081 01:17:52 -- common/autobuild_common.sh@495 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:02:07.081 01:17:52 -- common/autobuild_common.sh@497 -- $ start_monitor_resources 00:02:07.081 01:17:52 -- pm/common@17 -- $ local monitor 00:02:07.081 01:17:52 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:07.081 01:17:52 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:07.081 01:17:52 -- pm/common@19 -- $ for monitor in 
"${MONITOR_RESOURCES[@]}" 00:02:07.081 01:17:52 -- pm/common@21 -- $ date +%s 00:02:07.081 01:17:52 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:07.081 01:17:52 -- pm/common@21 -- $ date +%s 00:02:07.081 01:17:52 -- pm/common@25 -- $ sleep 1 00:02:07.081 01:17:52 -- pm/common@21 -- $ date +%s 00:02:07.081 01:17:52 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1734394672 00:02:07.081 01:17:52 -- pm/common@21 -- $ date +%s 00:02:07.081 01:17:52 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1734394672 00:02:07.081 01:17:52 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1734394672 00:02:07.081 01:17:52 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1734394672 00:02:07.081 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1734394672_collect-cpu-load.pm.log 00:02:07.081 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1734394672_collect-vmstat.pm.log 00:02:07.081 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1734394672_collect-cpu-temp.pm.log 00:02:07.081 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1734394672_collect-bmc-pm.bmc.pm.log 00:02:08.021 01:17:53 -- common/autobuild_common.sh@498 -- $ trap stop_monitor_resources EXIT 00:02:08.021 01:17:53 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:02:08.021 01:17:53 -- spdk/autobuild.sh@12 -- $ umask 022 00:02:08.021 01:17:53 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:08.021 01:17:53 -- spdk/autobuild.sh@16 -- $ date -u 00:02:08.021 Tue Dec 17 12:17:53 AM UTC 2024 00:02:08.021 01:17:53 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:02:08.021 v24.09-1-gb18e1bd62 00:02:08.021 01:17:53 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:02:08.021 01:17:53 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:02:08.021 01:17:53 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:02:08.021 01:17:53 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:02:08.021 01:17:53 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:08.021 01:17:53 -- common/autotest_common.sh@10 -- $ set +x 00:02:08.021 ************************************ 00:02:08.021 START TEST ubsan 00:02:08.021 ************************************ 00:02:08.021 01:17:53 ubsan -- common/autotest_common.sh@1125 -- $ echo 'using ubsan' 00:02:08.021 using ubsan 00:02:08.021 00:02:08.021 real 0m0.001s 00:02:08.021 user 0m0.001s 00:02:08.021 sys 0m0.000s 00:02:08.021 01:17:53 ubsan -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:08.021 01:17:53 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:02:08.021 ************************************ 00:02:08.021 END TEST ubsan 00:02:08.021 ************************************ 00:02:08.021 01:17:53 -- 
spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:02:08.021 01:17:53 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:02:08.021 01:17:53 -- common/autobuild_common.sh@442 -- $ run_test build_native_dpdk _build_native_dpdk 00:02:08.021 01:17:53 -- common/autotest_common.sh@1101 -- $ '[' 2 -le 1 ']' 00:02:08.021 01:17:53 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:08.021 01:17:53 -- common/autotest_common.sh@10 -- $ set +x 00:02:08.021 ************************************ 00:02:08.021 START TEST build_native_dpdk 00:02:08.021 ************************************ 00:02:08.021 01:17:53 build_native_dpdk -- common/autotest_common.sh@1125 -- $ _build_native_dpdk 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:08.021 01:17:53 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:02:08.021 caf0f5d395 version: 22.11.4 00:02:08.021 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:02:08.021 dc9c799c7d vhost: fix missing spinlock unlock 00:02:08.021 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:02:08.021 6ef77f2a5e net/gve: fix RX buffer size alignment 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@168 -- $ uname -s 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:02:08.021 01:17:54 build_native_dpdk -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:08.021 
01:17:54 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:02:08.021 01:17:54 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:02:08.281 01:17:54 build_native_dpdk -- common/autobuild_common.sh@173 -- $ patch -p1 00:02:08.281 patching file config/rte_config.h 00:02:08.281 Hunk #1 succeeded at 60 (offset 1 line). 00:02:08.281 01:17:54 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '<' 24.07.0 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@368 -- $ return 0 00:02:08.281 01:17:54 build_native_dpdk -- common/autobuild_common.sh@177 -- $ patch -p1 00:02:08.281 patching file lib/pcapng/rte_pcapng.c 00:02:08.281 Hunk #1 succeeded at 110 (offset -18 lines). 00:02:08.281 01:17:54 build_native_dpdk -- common/autobuild_common.sh@179 -- $ ge 22.11.4 24.07.0 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 22.11.4 '>=' 24.07.0 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>=' 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@348 -- $ : 1 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:08.281 01:17:54 build_native_dpdk -- scripts/common.sh@368 -- $ return 1 00:02:08.281 01:17:54 build_native_dpdk -- common/autobuild_common.sh@183 -- $ dpdk_kmods=false 00:02:08.281 01:17:54 build_native_dpdk -- common/autobuild_common.sh@184 -- $ uname -s 00:02:08.281 01:17:54 build_native_dpdk -- common/autobuild_common.sh@184 -- $ '[' Linux = FreeBSD ']' 00:02:08.281 01:17:54 build_native_dpdk -- common/autobuild_common.sh@188 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:02:08.281 01:17:54 build_native_dpdk -- common/autobuild_common.sh@188 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:02:13.662 The Meson build system 00:02:13.662 Version: 1.5.0 00:02:13.662 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:13.662 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:02:13.662 Build type: native build 00:02:13.662 Program cat found: YES (/usr/bin/cat) 00:02:13.662 Project name: DPDK 00:02:13.662 Project version: 22.11.4 00:02:13.662 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:02:13.662 C linker for the host machine: gcc ld.bfd 2.40-14 00:02:13.662 Host machine cpu family: x86_64 00:02:13.662 Host machine cpu: x86_64 00:02:13.662 Message: ## Building in Developer Mode ## 00:02:13.662 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:13.662 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:02:13.662 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:02:13.662 Program objdump found: YES (/usr/bin/objdump) 00:02:13.662 Program python3 found: YES (/usr/bin/python3) 00:02:13.662 Program cat found: YES (/usr/bin/cat) 00:02:13.662 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
00:02:13.662 Checking for size of "void *" : 8 00:02:13.662 Checking for size of "void *" : 8 (cached) 00:02:13.662 Library m found: YES 00:02:13.662 Library numa found: YES 00:02:13.662 Has header "numaif.h" : YES 00:02:13.662 Library fdt found: NO 00:02:13.662 Library execinfo found: NO 00:02:13.662 Has header "execinfo.h" : YES 00:02:13.662 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:13.662 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:13.662 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:13.662 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:13.662 Run-time dependency openssl found: YES 3.1.1 00:02:13.662 Run-time dependency libpcap found: YES 1.10.4 00:02:13.662 Has header "pcap.h" with dependency libpcap: YES 00:02:13.662 Compiler for C supports arguments -Wcast-qual: YES 00:02:13.662 Compiler for C supports arguments -Wdeprecated: YES 00:02:13.662 Compiler for C supports arguments -Wformat: YES 00:02:13.662 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:13.662 Compiler for C supports arguments -Wformat-security: NO 00:02:13.662 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:13.662 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:13.662 Compiler for C supports arguments -Wnested-externs: YES 00:02:13.662 Compiler for C supports arguments -Wold-style-definition: YES 00:02:13.662 Compiler for C supports arguments -Wpointer-arith: YES 00:02:13.662 Compiler for C supports arguments -Wsign-compare: YES 00:02:13.662 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:13.662 Compiler for C supports arguments -Wundef: YES 00:02:13.662 Compiler for C supports arguments -Wwrite-strings: YES 00:02:13.662 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:13.662 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:13.662 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:13.662 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:13.662 Compiler for C supports arguments -mavx512f: YES 00:02:13.662 Checking if "AVX512 checking" compiles: YES 00:02:13.662 Fetching value of define "__SSE4_2__" : 1 00:02:13.662 Fetching value of define "__AES__" : 1 00:02:13.662 Fetching value of define "__AVX__" : 1 00:02:13.662 Fetching value of define "__AVX2__" : 1 00:02:13.662 Fetching value of define "__AVX512BW__" : 1 00:02:13.662 Fetching value of define "__AVX512CD__" : 1 00:02:13.662 Fetching value of define "__AVX512DQ__" : 1 00:02:13.662 Fetching value of define "__AVX512F__" : 1 00:02:13.662 Fetching value of define "__AVX512VL__" : 1 00:02:13.662 Fetching value of define "__PCLMUL__" : 1 00:02:13.662 Fetching value of define "__RDRND__" : 1 00:02:13.662 Fetching value of define "__RDSEED__" : 1 00:02:13.662 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:13.662 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:13.662 Message: lib/kvargs: Defining dependency "kvargs" 00:02:13.662 Message: lib/telemetry: Defining dependency "telemetry" 00:02:13.662 Checking for function "getentropy" : YES 00:02:13.662 Message: lib/eal: Defining dependency "eal" 00:02:13.662 Message: lib/ring: Defining dependency "ring" 00:02:13.662 Message: lib/rcu: Defining dependency "rcu" 00:02:13.662 Message: lib/mempool: Defining dependency "mempool" 00:02:13.662 Message: lib/mbuf: Defining dependency "mbuf" 00:02:13.662 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:13.662 Fetching 
value of define "__AVX512F__" : 1 (cached) 00:02:13.662 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:13.662 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:13.662 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:13.662 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:13.662 Compiler for C supports arguments -mpclmul: YES 00:02:13.662 Compiler for C supports arguments -maes: YES 00:02:13.662 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:13.662 Compiler for C supports arguments -mavx512bw: YES 00:02:13.662 Compiler for C supports arguments -mavx512dq: YES 00:02:13.662 Compiler for C supports arguments -mavx512vl: YES 00:02:13.662 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:13.662 Compiler for C supports arguments -mavx2: YES 00:02:13.662 Compiler for C supports arguments -mavx: YES 00:02:13.662 Message: lib/net: Defining dependency "net" 00:02:13.662 Message: lib/meter: Defining dependency "meter" 00:02:13.662 Message: lib/ethdev: Defining dependency "ethdev" 00:02:13.662 Message: lib/pci: Defining dependency "pci" 00:02:13.662 Message: lib/cmdline: Defining dependency "cmdline" 00:02:13.662 Message: lib/metrics: Defining dependency "metrics" 00:02:13.662 Message: lib/hash: Defining dependency "hash" 00:02:13.662 Message: lib/timer: Defining dependency "timer" 00:02:13.662 Fetching value of define "__AVX2__" : 1 (cached) 00:02:13.662 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:13.662 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:13.662 Fetching value of define "__AVX512CD__" : 1 (cached) 00:02:13.662 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:13.662 Message: lib/acl: Defining dependency "acl" 00:02:13.662 Message: lib/bbdev: Defining dependency "bbdev" 00:02:13.662 Message: lib/bitratestats: Defining dependency "bitratestats" 00:02:13.662 Run-time dependency libelf found: YES 0.191 00:02:13.662 Message: lib/bpf: Defining dependency "bpf" 00:02:13.662 Message: lib/cfgfile: Defining dependency "cfgfile" 00:02:13.662 Message: lib/compressdev: Defining dependency "compressdev" 00:02:13.662 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:13.662 Message: lib/distributor: Defining dependency "distributor" 00:02:13.662 Message: lib/efd: Defining dependency "efd" 00:02:13.662 Message: lib/eventdev: Defining dependency "eventdev" 00:02:13.662 Message: lib/gpudev: Defining dependency "gpudev" 00:02:13.662 Message: lib/gro: Defining dependency "gro" 00:02:13.662 Message: lib/gso: Defining dependency "gso" 00:02:13.662 Message: lib/ip_frag: Defining dependency "ip_frag" 00:02:13.662 Message: lib/jobstats: Defining dependency "jobstats" 00:02:13.662 Message: lib/latencystats: Defining dependency "latencystats" 00:02:13.662 Message: lib/lpm: Defining dependency "lpm" 00:02:13.662 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:13.662 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:13.662 Fetching value of define "__AVX512IFMA__" : (undefined) 00:02:13.662 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:02:13.662 Message: lib/member: Defining dependency "member" 00:02:13.662 Message: lib/pcapng: Defining dependency "pcapng" 00:02:13.662 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:13.662 Message: lib/power: Defining dependency "power" 00:02:13.662 Message: lib/rawdev: Defining dependency "rawdev" 00:02:13.662 Message: lib/regexdev: Defining dependency "regexdev" 00:02:13.662 Message: lib/dmadev: 
Defining dependency "dmadev" 00:02:13.662 Message: lib/rib: Defining dependency "rib" 00:02:13.662 Message: lib/reorder: Defining dependency "reorder" 00:02:13.662 Message: lib/sched: Defining dependency "sched" 00:02:13.662 Message: lib/security: Defining dependency "security" 00:02:13.662 Message: lib/stack: Defining dependency "stack" 00:02:13.662 Has header "linux/userfaultfd.h" : YES 00:02:13.662 Message: lib/vhost: Defining dependency "vhost" 00:02:13.662 Message: lib/ipsec: Defining dependency "ipsec" 00:02:13.662 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:13.662 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:13.662 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:13.662 Message: lib/fib: Defining dependency "fib" 00:02:13.662 Message: lib/port: Defining dependency "port" 00:02:13.662 Message: lib/pdump: Defining dependency "pdump" 00:02:13.662 Message: lib/table: Defining dependency "table" 00:02:13.662 Message: lib/pipeline: Defining dependency "pipeline" 00:02:13.662 Message: lib/graph: Defining dependency "graph" 00:02:13.662 Message: lib/node: Defining dependency "node" 00:02:13.662 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:13.662 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:13.662 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:13.662 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:13.662 Compiler for C supports arguments -Wno-sign-compare: YES 00:02:13.662 Compiler for C supports arguments -Wno-unused-value: YES 00:02:13.662 Compiler for C supports arguments -Wno-format: YES 00:02:13.662 Compiler for C supports arguments -Wno-format-security: YES 00:02:13.662 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:02:13.922 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:02:13.922 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:02:13.922 Compiler for C supports arguments -Wno-unused-parameter: YES 00:02:13.922 Fetching value of define "__AVX2__" : 1 (cached) 00:02:13.922 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:13.922 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:13.922 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:13.922 Compiler for C supports arguments -mavx512bw: YES (cached) 00:02:13.922 Compiler for C supports arguments -march=skylake-avx512: YES 00:02:13.922 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:02:13.922 Program doxygen found: YES (/usr/local/bin/doxygen) 00:02:13.922 Configuring doxy-api.conf using configuration 00:02:13.922 Program sphinx-build found: NO 00:02:13.922 Configuring rte_build_config.h using configuration 00:02:13.922 Message: 00:02:13.922 ================= 00:02:13.922 Applications Enabled 00:02:13.922 ================= 00:02:13.922 00:02:13.922 apps: 00:02:13.922 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, 00:02:13.922 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, 00:02:13.922 test-security-perf, 00:02:13.922 00:02:13.922 Message: 00:02:13.922 ================= 00:02:13.922 Libraries Enabled 00:02:13.922 ================= 00:02:13.922 00:02:13.922 libs: 00:02:13.922 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:02:13.922 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:02:13.922 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd, 00:02:13.922 eventdev, 
gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm, 00:02:13.922 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder, 00:02:13.923 sched, security, stack, vhost, ipsec, fib, port, pdump, 00:02:13.923 table, pipeline, graph, node, 00:02:13.923 00:02:13.923 Message: 00:02:13.923 =============== 00:02:13.923 Drivers Enabled 00:02:13.923 =============== 00:02:13.923 00:02:13.923 common: 00:02:13.923 00:02:13.923 bus: 00:02:13.923 pci, vdev, 00:02:13.923 mempool: 00:02:13.923 ring, 00:02:13.923 dma: 00:02:13.923 00:02:13.923 net: 00:02:13.923 i40e, 00:02:13.923 raw: 00:02:13.923 00:02:13.923 crypto: 00:02:13.923 00:02:13.923 compress: 00:02:13.923 00:02:13.923 regex: 00:02:13.923 00:02:13.923 vdpa: 00:02:13.923 00:02:13.923 event: 00:02:13.923 00:02:13.923 baseband: 00:02:13.923 00:02:13.923 gpu: 00:02:13.923 00:02:13.923 00:02:13.923 Message: 00:02:13.923 ================= 00:02:13.923 Content Skipped 00:02:13.923 ================= 00:02:13.923 00:02:13.923 apps: 00:02:13.923 00:02:13.923 libs: 00:02:13.923 kni: explicitly disabled via build config (deprecated lib) 00:02:13.923 flow_classify: explicitly disabled via build config (deprecated lib) 00:02:13.923 00:02:13.923 drivers: 00:02:13.923 common/cpt: not in enabled drivers build config 00:02:13.923 common/dpaax: not in enabled drivers build config 00:02:13.923 common/iavf: not in enabled drivers build config 00:02:13.923 common/idpf: not in enabled drivers build config 00:02:13.923 common/mvep: not in enabled drivers build config 00:02:13.923 common/octeontx: not in enabled drivers build config 00:02:13.923 bus/auxiliary: not in enabled drivers build config 00:02:13.923 bus/dpaa: not in enabled drivers build config 00:02:13.923 bus/fslmc: not in enabled drivers build config 00:02:13.923 bus/ifpga: not in enabled drivers build config 00:02:13.923 bus/vmbus: not in enabled drivers build config 00:02:13.923 common/cnxk: not in enabled drivers build config 00:02:13.923 common/mlx5: not in enabled drivers build config 00:02:13.923 common/qat: not in enabled drivers build config 00:02:13.923 common/sfc_efx: not in enabled drivers build config 00:02:13.923 mempool/bucket: not in enabled drivers build config 00:02:13.923 mempool/cnxk: not in enabled drivers build config 00:02:13.923 mempool/dpaa: not in enabled drivers build config 00:02:13.923 mempool/dpaa2: not in enabled drivers build config 00:02:13.923 mempool/octeontx: not in enabled drivers build config 00:02:13.923 mempool/stack: not in enabled drivers build config 00:02:13.923 dma/cnxk: not in enabled drivers build config 00:02:13.923 dma/dpaa: not in enabled drivers build config 00:02:13.923 dma/dpaa2: not in enabled drivers build config 00:02:13.923 dma/hisilicon: not in enabled drivers build config 00:02:13.923 dma/idxd: not in enabled drivers build config 00:02:13.923 dma/ioat: not in enabled drivers build config 00:02:13.923 dma/skeleton: not in enabled drivers build config 00:02:13.923 net/af_packet: not in enabled drivers build config 00:02:13.923 net/af_xdp: not in enabled drivers build config 00:02:13.923 net/ark: not in enabled drivers build config 00:02:13.923 net/atlantic: not in enabled drivers build config 00:02:13.923 net/avp: not in enabled drivers build config 00:02:13.923 net/axgbe: not in enabled drivers build config 00:02:13.923 net/bnx2x: not in enabled drivers build config 00:02:13.923 net/bnxt: not in enabled drivers build config 00:02:13.923 net/bonding: not in enabled drivers build config 00:02:13.923 net/cnxk: not in enabled drivers build config 
00:02:13.923 net/cxgbe: not in enabled drivers build config 00:02:13.923 net/dpaa: not in enabled drivers build config 00:02:13.923 net/dpaa2: not in enabled drivers build config 00:02:13.923 net/e1000: not in enabled drivers build config 00:02:13.923 net/ena: not in enabled drivers build config 00:02:13.923 net/enetc: not in enabled drivers build config 00:02:13.923 net/enetfec: not in enabled drivers build config 00:02:13.923 net/enic: not in enabled drivers build config 00:02:13.923 net/failsafe: not in enabled drivers build config 00:02:13.923 net/fm10k: not in enabled drivers build config 00:02:13.923 net/gve: not in enabled drivers build config 00:02:13.923 net/hinic: not in enabled drivers build config 00:02:13.923 net/hns3: not in enabled drivers build config 00:02:13.923 net/iavf: not in enabled drivers build config 00:02:13.923 net/ice: not in enabled drivers build config 00:02:13.923 net/idpf: not in enabled drivers build config 00:02:13.923 net/igc: not in enabled drivers build config 00:02:13.923 net/ionic: not in enabled drivers build config 00:02:13.923 net/ipn3ke: not in enabled drivers build config 00:02:13.923 net/ixgbe: not in enabled drivers build config 00:02:13.923 net/kni: not in enabled drivers build config 00:02:13.923 net/liquidio: not in enabled drivers build config 00:02:13.923 net/mana: not in enabled drivers build config 00:02:13.923 net/memif: not in enabled drivers build config 00:02:13.923 net/mlx4: not in enabled drivers build config 00:02:13.923 net/mlx5: not in enabled drivers build config 00:02:13.923 net/mvneta: not in enabled drivers build config 00:02:13.923 net/mvpp2: not in enabled drivers build config 00:02:13.923 net/netvsc: not in enabled drivers build config 00:02:13.923 net/nfb: not in enabled drivers build config 00:02:13.923 net/nfp: not in enabled drivers build config 00:02:13.923 net/ngbe: not in enabled drivers build config 00:02:13.923 net/null: not in enabled drivers build config 00:02:13.923 net/octeontx: not in enabled drivers build config 00:02:13.923 net/octeon_ep: not in enabled drivers build config 00:02:13.923 net/pcap: not in enabled drivers build config 00:02:13.923 net/pfe: not in enabled drivers build config 00:02:13.923 net/qede: not in enabled drivers build config 00:02:13.923 net/ring: not in enabled drivers build config 00:02:13.923 net/sfc: not in enabled drivers build config 00:02:13.923 net/softnic: not in enabled drivers build config 00:02:13.923 net/tap: not in enabled drivers build config 00:02:13.923 net/thunderx: not in enabled drivers build config 00:02:13.923 net/txgbe: not in enabled drivers build config 00:02:13.923 net/vdev_netvsc: not in enabled drivers build config 00:02:13.923 net/vhost: not in enabled drivers build config 00:02:13.923 net/virtio: not in enabled drivers build config 00:02:13.923 net/vmxnet3: not in enabled drivers build config 00:02:13.923 raw/cnxk_bphy: not in enabled drivers build config 00:02:13.923 raw/cnxk_gpio: not in enabled drivers build config 00:02:13.923 raw/dpaa2_cmdif: not in enabled drivers build config 00:02:13.923 raw/ifpga: not in enabled drivers build config 00:02:13.923 raw/ntb: not in enabled drivers build config 00:02:13.923 raw/skeleton: not in enabled drivers build config 00:02:13.923 crypto/armv8: not in enabled drivers build config 00:02:13.923 crypto/bcmfs: not in enabled drivers build config 00:02:13.923 crypto/caam_jr: not in enabled drivers build config 00:02:13.923 crypto/ccp: not in enabled drivers build config 00:02:13.923 crypto/cnxk: not in enabled drivers 
build config 00:02:13.923 crypto/dpaa_sec: not in enabled drivers build config 00:02:13.923 crypto/dpaa2_sec: not in enabled drivers build config 00:02:13.923 crypto/ipsec_mb: not in enabled drivers build config 00:02:13.923 crypto/mlx5: not in enabled drivers build config 00:02:13.923 crypto/mvsam: not in enabled drivers build config 00:02:13.923 crypto/nitrox: not in enabled drivers build config 00:02:13.923 crypto/null: not in enabled drivers build config 00:02:13.923 crypto/octeontx: not in enabled drivers build config 00:02:13.923 crypto/openssl: not in enabled drivers build config 00:02:13.923 crypto/scheduler: not in enabled drivers build config 00:02:13.923 crypto/uadk: not in enabled drivers build config 00:02:13.923 crypto/virtio: not in enabled drivers build config 00:02:13.923 compress/isal: not in enabled drivers build config 00:02:13.923 compress/mlx5: not in enabled drivers build config 00:02:13.923 compress/octeontx: not in enabled drivers build config 00:02:13.923 compress/zlib: not in enabled drivers build config 00:02:13.923 regex/mlx5: not in enabled drivers build config 00:02:13.923 regex/cn9k: not in enabled drivers build config 00:02:13.923 vdpa/ifc: not in enabled drivers build config 00:02:13.923 vdpa/mlx5: not in enabled drivers build config 00:02:13.923 vdpa/sfc: not in enabled drivers build config 00:02:13.923 event/cnxk: not in enabled drivers build config 00:02:13.923 event/dlb2: not in enabled drivers build config 00:02:13.923 event/dpaa: not in enabled drivers build config 00:02:13.923 event/dpaa2: not in enabled drivers build config 00:02:13.923 event/dsw: not in enabled drivers build config 00:02:13.923 event/opdl: not in enabled drivers build config 00:02:13.923 event/skeleton: not in enabled drivers build config 00:02:13.923 event/sw: not in enabled drivers build config 00:02:13.923 event/octeontx: not in enabled drivers build config 00:02:13.923 baseband/acc: not in enabled drivers build config 00:02:13.923 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:02:13.923 baseband/fpga_lte_fec: not in enabled drivers build config 00:02:13.923 baseband/la12xx: not in enabled drivers build config 00:02:13.923 baseband/null: not in enabled drivers build config 00:02:13.923 baseband/turbo_sw: not in enabled drivers build config 00:02:13.923 gpu/cuda: not in enabled drivers build config 00:02:13.923 00:02:13.923 00:02:13.923 Build targets in project: 311 00:02:13.923 00:02:13.923 DPDK 22.11.4 00:02:13.923 00:02:13.923 User defined options 00:02:13.923 libdir : lib 00:02:13.923 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:13.923 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:02:13.923 c_link_args : 00:02:13.923 enable_docs : false 00:02:13.923 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:02:13.923 enable_kmods : false 00:02:13.923 machine : native 00:02:13.923 tests : false 00:02:13.923 00:02:13.923 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:13.923 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
00:02:14.193 01:17:59 build_native_dpdk -- common/autobuild_common.sh@192 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:02:14.193 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:14.193 [1/740] Generating lib/rte_kvargs_def with a custom command 00:02:14.193 [2/740] Generating lib/rte_kvargs_mingw with a custom command 00:02:14.193 [3/740] Generating lib/rte_telemetry_def with a custom command 00:02:14.193 [4/740] Generating lib/rte_telemetry_mingw with a custom command 00:02:14.193 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:14.193 [6/740] Generating lib/rte_mempool_def with a custom command 00:02:14.193 [7/740] Generating lib/rte_mempool_mingw with a custom command 00:02:14.193 [8/740] Generating lib/rte_mbuf_def with a custom command 00:02:14.193 [9/740] Generating lib/rte_ring_def with a custom command 00:02:14.193 [10/740] Generating lib/rte_eal_def with a custom command 00:02:14.193 [11/740] Generating lib/rte_ring_mingw with a custom command 00:02:14.461 [12/740] Generating lib/rte_rcu_def with a custom command 00:02:14.461 [13/740] Generating lib/rte_rcu_mingw with a custom command 00:02:14.461 [14/740] Generating lib/rte_mbuf_mingw with a custom command 00:02:14.461 [15/740] Generating lib/rte_net_def with a custom command 00:02:14.461 [16/740] Generating lib/rte_meter_def with a custom command 00:02:14.461 [17/740] Generating lib/rte_meter_mingw with a custom command 00:02:14.461 [18/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:14.461 [19/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:14.461 [20/740] Generating lib/rte_eal_mingw with a custom command 00:02:14.461 [21/740] Generating lib/rte_net_mingw with a custom command 00:02:14.461 [22/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:14.461 [23/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:14.461 [24/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:14.461 [25/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:14.461 [26/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:14.461 [27/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:14.461 [28/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:14.461 [29/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:02:14.461 [30/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:14.461 [31/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:14.461 [32/740] Generating lib/rte_ethdev_mingw with a custom command 00:02:14.461 [33/740] Generating lib/rte_pci_mingw with a custom command 00:02:14.461 [34/740] Generating lib/rte_ethdev_def with a custom command 00:02:14.461 [35/740] Generating lib/rte_pci_def with a custom command 00:02:14.461 [36/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:14.461 [37/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:14.461 [38/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:14.461 [39/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:14.461 [40/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:14.461 [41/740] Compiling C object 
lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:14.461 [42/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:14.461 [43/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:14.461 [44/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:14.461 [45/740] Generating lib/rte_cmdline_def with a custom command 00:02:14.461 [46/740] Generating lib/rte_metrics_def with a custom command 00:02:14.461 [47/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:14.461 [48/740] Generating lib/rte_cmdline_mingw with a custom command 00:02:14.461 [49/740] Generating lib/rte_metrics_mingw with a custom command 00:02:14.461 [50/740] Linking static target lib/librte_kvargs.a 00:02:14.461 [51/740] Generating lib/rte_hash_def with a custom command 00:02:14.461 [52/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:14.461 [53/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:14.461 [54/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:14.461 [55/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:14.461 [56/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:14.461 [57/740] Generating lib/rte_hash_mingw with a custom command 00:02:14.461 [58/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:14.461 [59/740] Generating lib/rte_timer_def with a custom command 00:02:14.461 [60/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:14.461 [61/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:14.461 [62/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:14.461 [63/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:14.461 [64/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:14.461 [65/740] Generating lib/rte_timer_mingw with a custom command 00:02:14.461 [66/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:14.461 [67/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:14.461 [68/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:14.461 [69/740] Generating lib/rte_acl_mingw with a custom command 00:02:14.461 [70/740] Generating lib/rte_acl_def with a custom command 00:02:14.461 [71/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:14.461 [72/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:14.461 [73/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:14.461 [74/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:14.461 [75/740] Generating lib/rte_bbdev_def with a custom command 00:02:14.461 [76/740] Generating lib/rte_bbdev_mingw with a custom command 00:02:14.461 [77/740] Generating lib/rte_bitratestats_def with a custom command 00:02:14.461 [78/740] Generating lib/rte_bitratestats_mingw with a custom command 00:02:14.461 [79/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:14.461 [80/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:14.461 [81/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:14.461 [82/740] Linking static target lib/librte_pci.a 00:02:14.461 [83/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:14.461 
[84/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:14.461 [85/740] Generating lib/rte_bpf_def with a custom command 00:02:14.461 [86/740] Generating lib/rte_bpf_mingw with a custom command 00:02:14.461 [87/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:14.461 [88/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:14.461 [89/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:14.461 [90/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:14.723 [91/740] Linking static target lib/librte_meter.a 00:02:14.723 [92/740] Generating lib/rte_cfgfile_def with a custom command 00:02:14.723 [93/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:14.723 [94/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:14.723 [95/740] Generating lib/rte_cfgfile_mingw with a custom command 00:02:14.723 [96/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:14.723 [97/740] Generating lib/rte_compressdev_def with a custom command 00:02:14.723 [98/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:14.723 [99/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:14.723 [100/740] Generating lib/rte_compressdev_mingw with a custom command 00:02:14.723 [101/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:14.723 [102/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:14.723 [103/740] Linking static target lib/librte_ring.a 00:02:14.723 [104/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:14.723 [105/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:14.723 [106/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:02:14.723 [107/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:14.723 [108/740] Generating lib/rte_cryptodev_def with a custom command 00:02:14.723 [109/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:14.723 [110/740] Generating lib/rte_distributor_mingw with a custom command 00:02:14.723 [111/740] Generating lib/rte_cryptodev_mingw with a custom command 00:02:14.723 [112/740] Generating lib/rte_distributor_def with a custom command 00:02:14.723 [113/740] Generating lib/rte_efd_mingw with a custom command 00:02:14.723 [114/740] Generating lib/rte_efd_def with a custom command 00:02:14.723 [115/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:14.723 [116/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:14.723 [117/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:14.724 [118/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:14.724 [119/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:14.724 [120/740] Generating lib/rte_eventdev_def with a custom command 00:02:14.724 [121/740] Generating lib/rte_eventdev_mingw with a custom command 00:02:14.724 [122/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:14.724 [123/740] Generating lib/rte_gpudev_def with a custom command 00:02:14.724 [124/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:14.724 [125/740] Generating lib/rte_gpudev_mingw with a custom command 00:02:14.724 
[126/740] Generating lib/rte_gro_def with a custom command 00:02:14.724 [127/740] Generating lib/rte_gro_mingw with a custom command 00:02:14.724 [128/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:14.724 [129/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:02:14.724 [130/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:14.724 [131/740] Generating lib/rte_gso_def with a custom command 00:02:14.724 [132/740] Generating lib/rte_gso_mingw with a custom command 00:02:14.724 [133/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:14.724 [134/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:14.985 [135/740] Generating lib/rte_ip_frag_def with a custom command 00:02:14.985 [136/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:14.985 [137/740] Generating lib/rte_ip_frag_mingw with a custom command 00:02:14.985 [138/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.985 [139/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.985 [140/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:14.985 [141/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:14.985 [142/740] Generating lib/rte_jobstats_def with a custom command 00:02:14.985 [143/740] Generating lib/rte_jobstats_mingw with a custom command 00:02:14.985 [144/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:02:14.985 [145/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:14.985 [146/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.985 [147/740] Generating lib/rte_latencystats_def with a custom command 00:02:14.985 [148/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:14.985 [149/740] Linking target lib/librte_kvargs.so.23.0 00:02:14.985 [150/740] Linking static target lib/librte_cfgfile.a 00:02:14.985 [151/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:14.985 [152/740] Generating lib/rte_latencystats_mingw with a custom command 00:02:14.985 [153/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:14.985 [154/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:14.985 [155/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:14.986 [156/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:14.986 [157/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:14.986 [158/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:14.986 [159/740] Generating lib/rte_lpm_def with a custom command 00:02:14.986 [160/740] Generating lib/rte_lpm_mingw with a custom command 00:02:14.986 [161/740] Generating lib/rte_member_def with a custom command 00:02:14.986 [162/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:14.986 [163/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:14.986 [164/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:14.986 [165/740] Generating lib/rte_member_mingw with a custom command 00:02:14.986 [166/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:14.986 [167/740] Generating lib/rte_pcapng_def with a custom 
command 00:02:14.986 [168/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:02:14.986 [169/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:02:14.986 [170/740] Generating lib/rte_pcapng_mingw with a custom command 00:02:14.986 [171/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:14.986 [172/740] Linking static target lib/librte_jobstats.a 00:02:14.986 [173/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.986 [174/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:15.247 [175/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:15.247 [176/740] Linking static target lib/librte_cmdline.a 00:02:15.247 [177/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:15.247 [178/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:15.247 [179/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:15.247 [180/740] Linking static target lib/librte_timer.a 00:02:15.247 [181/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:15.247 [182/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:15.247 [183/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:15.247 [184/740] Generating lib/rte_power_def with a custom command 00:02:15.247 [185/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:02:15.247 [186/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:15.247 [187/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:02:15.247 [188/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:15.247 [189/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:02:15.247 [190/740] Linking static target lib/librte_telemetry.a 00:02:15.247 [191/740] Linking static target lib/librte_metrics.a 00:02:15.247 [192/740] Generating lib/rte_rawdev_def with a custom command 00:02:15.247 [193/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:15.247 [194/740] Generating lib/rte_power_mingw with a custom command 00:02:15.247 [195/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:15.247 [196/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:15.247 [197/740] Generating lib/rte_rawdev_mingw with a custom command 00:02:15.247 [198/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:15.247 [199/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:15.247 [200/740] Generating lib/rte_regexdev_def with a custom command 00:02:15.247 [201/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:02:15.247 [202/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:02:15.247 [203/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:02:15.247 [204/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:15.248 [205/740] Generating lib/rte_dmadev_def with a custom command 00:02:15.248 [206/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:15.248 [207/740] Generating lib/rte_dmadev_mingw with a custom command 00:02:15.248 [208/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:02:15.248 [209/740] Generating lib/rte_regexdev_mingw with a custom command 00:02:15.248 [210/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 
00:02:15.248 [211/740] Generating lib/rte_rib_mingw with a custom command 00:02:15.248 [212/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:02:15.248 [213/740] Generating lib/rte_rib_def with a custom command 00:02:15.248 [214/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:15.248 [215/740] Generating lib/rte_reorder_def with a custom command 00:02:15.248 [216/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:15.248 [217/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:02:15.248 [218/740] Generating lib/rte_reorder_mingw with a custom command 00:02:15.248 [219/740] Linking static target lib/librte_net.a 00:02:15.248 [220/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:15.248 [221/740] Linking static target lib/librte_bitratestats.a 00:02:15.248 [222/740] Generating lib/rte_sched_mingw with a custom command 00:02:15.248 [223/740] Generating lib/rte_sched_def with a custom command 00:02:15.248 [224/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:15.248 [225/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:15.248 [226/740] Generating lib/rte_security_def with a custom command 00:02:15.248 [227/740] Generating lib/rte_security_mingw with a custom command 00:02:15.248 [228/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:02:15.248 [229/740] Generating lib/rte_stack_def with a custom command 00:02:15.248 [230/740] Generating lib/rte_stack_mingw with a custom command 00:02:15.248 [231/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:15.248 [232/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:02:15.248 [233/740] Generating lib/rte_vhost_def with a custom command 00:02:15.248 [234/740] Generating lib/rte_vhost_mingw with a custom command 00:02:15.248 [235/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:02:15.248 [236/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:15.248 [237/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:15.248 [238/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:02:15.248 [239/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:15.248 [240/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:02:15.248 [241/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:15.510 [242/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:15.510 [243/740] Generating lib/rte_ipsec_mingw with a custom command 00:02:15.510 [244/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:02:15.510 [245/740] Generating lib/rte_ipsec_def with a custom command 00:02:15.510 [246/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:02:15.510 [247/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:02:15.510 [248/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:02:15.510 [249/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:02:15.510 [250/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:02:15.510 [251/740] Generating lib/rte_fib_def with a custom command 00:02:15.510 [252/740] Generating lib/rte_fib_mingw with a custom command 00:02:15.510 [253/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:02:15.510 
[254/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:02:15.510 [255/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:02:15.510 [256/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:02:15.510 [257/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:02:15.510 [258/740] Linking static target lib/librte_stack.a 00:02:15.510 [259/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:02:15.510 [260/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:15.510 [261/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:02:15.510 [262/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:15.510 [263/740] Linking static target lib/librte_compressdev.a 00:02:15.510 [264/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:02:15.510 [265/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:02:15.510 [266/740] Generating lib/rte_port_def with a custom command 00:02:15.510 [267/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:15.510 [268/740] Generating lib/rte_pdump_def with a custom command 00:02:15.510 [269/740] Generating lib/rte_port_mingw with a custom command 00:02:15.510 [270/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:15.510 [271/740] Generating lib/rte_pdump_mingw with a custom command 00:02:15.510 [272/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:15.510 [273/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:02:15.510 [274/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:02:15.510 [275/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.510 [276/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:02:15.510 [277/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:02:15.510 [278/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:02:15.510 [279/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:15.510 [280/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:15.510 [281/740] Linking static target lib/librte_rcu.a 00:02:15.510 [282/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.510 [283/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:02:15.510 [284/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.510 [285/740] Linking static target lib/librte_mempool.a 00:02:15.775 [286/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:02:15.775 [287/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:02:15.775 [288/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:02:15.775 [289/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:02:15.775 [290/740] Linking static target lib/librte_rawdev.a 00:02:15.775 [291/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.775 [292/740] Linking static target lib/librte_bbdev.a 00:02:15.775 [293/740] Generating lib/rte_table_def with a custom command 00:02:15.775 [294/740] Generating lib/rte_table_mingw with a custom command 00:02:15.775 [295/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:02:15.775 [296/740] 
Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:02:15.775 [297/740] Linking static target lib/librte_gro.a 00:02:15.775 [298/740] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:15.775 [299/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:02:15.775 [300/740] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:02:15.775 [301/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.775 [302/740] Linking static target lib/librte_dmadev.a 00:02:15.775 [303/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.775 [304/740] Linking static target lib/librte_gpudev.a 00:02:15.775 [305/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:02:15.775 [306/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:02:15.775 [307/740] Linking static target lib/librte_gso.a 00:02:15.775 [308/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:02:15.775 [309/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:02:15.775 [310/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:02:15.775 [311/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.775 [312/740] Generating lib/rte_pipeline_def with a custom command 00:02:15.775 [313/740] Generating lib/rte_pipeline_mingw with a custom command 00:02:15.775 [314/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:15.775 [315/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:02:15.775 [316/740] Linking static target lib/librte_latencystats.a 00:02:15.775 [317/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.775 [318/740] Linking target lib/librte_telemetry.so.23.0 00:02:15.775 [319/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:02:15.775 [320/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:02:15.775 [321/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:15.775 [322/740] Linking static target lib/librte_distributor.a 00:02:15.775 [323/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:02:15.775 [324/740] Generating lib/rte_graph_mingw with a custom command 00:02:15.775 [325/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:02:15.775 [326/740] Generating lib/rte_graph_def with a custom command 00:02:15.775 [327/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:02:15.775 [328/740] Linking static target lib/librte_ip_frag.a 00:02:16.037 [329/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:16.037 [330/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:02:16.037 [331/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:02:16.037 [332/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:02:16.037 [333/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:02:16.037 [334/740] Generating lib/rte_node_def with a custom command 00:02:16.037 [335/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:02:16.037 [336/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:02:16.037 [337/740] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:16.037 [338/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:02:16.037 [339/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:02:16.037 [340/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:02:16.037 [341/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:02:16.037 [342/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:16.037 [343/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:02:16.037 [344/740] Linking static target lib/librte_regexdev.a 00:02:16.037 [345/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.037 [346/740] Generating lib/rte_node_mingw with a custom command 00:02:16.037 [347/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.037 [348/740] Linking static target lib/librte_eal.a 00:02:16.037 [349/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:16.037 [350/740] Generating drivers/rte_bus_pci_def with a custom command 00:02:16.037 [351/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:02:16.037 [352/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:16.037 [353/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.037 [354/740] Linking static target lib/librte_power.a 00:02:16.037 [355/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:16.037 [356/740] Generating drivers/rte_bus_vdev_def with a custom command 00:02:16.037 [357/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.037 [358/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:02:16.037 [359/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:02:16.037 [360/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:16.037 [361/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:16.037 [362/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:16.297 [363/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:02:16.297 [364/740] Linking static target lib/librte_reorder.a 00:02:16.297 [365/740] Generating drivers/rte_mempool_ring_def with a custom command 00:02:16.297 [366/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:02:16.297 [367/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:16.297 [368/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:02:16.297 [369/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:02:16.297 [370/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:02:16.297 [371/740] Linking static target lib/librte_security.a 00:02:16.297 [372/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:02:16.297 [373/740] Linking static target lib/librte_pcapng.a 00:02:16.297 [374/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:02:16.297 [375/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:02:16.297 [376/740] Linking static target lib/librte_bpf.a 00:02:16.297 [377/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.297 [378/740] Compiling C object 
lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:02:16.297 [379/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:16.297 [380/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:16.297 [381/740] Linking static target lib/librte_mbuf.a 00:02:16.297 [382/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:02:16.297 [383/740] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:02:16.297 [384/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:02:16.297 [385/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:02:16.297 [386/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:02:16.297 [387/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.297 [388/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.297 [389/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:02:16.297 [390/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:02:16.297 [391/740] Generating drivers/rte_net_i40e_def with a custom command 00:02:16.297 [392/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:02:16.297 [393/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:16.297 [394/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:02:16.297 [395/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:02:16.297 [396/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:16.562 [397/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:02:16.562 [398/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:02:16.562 [399/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:02:16.562 [400/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:02:16.562 [401/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:02:16.562 [402/740] Linking static target lib/librte_lpm.a 00:02:16.562 [403/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:16.562 [404/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:02:16.562 [405/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:16.562 [406/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:16.562 [407/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:02:16.562 [408/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:02:16.562 [409/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:02:16.562 [410/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:02:16.562 [411/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:02:16.562 [412/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:02:16.562 [413/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:02:16.562 [414/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:02:16.562 [415/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:02:16.562 [416/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:02:16.562 [417/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:02:16.562 [418/740] Generating lib/reorder.sym_chk 
with a custom command (wrapped by meson to capture output) 00:02:16.562 [419/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.562 [420/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:02:16.562 [421/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:02:16.562 [422/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:02:16.562 [423/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:02:16.562 [424/740] Linking static target lib/librte_graph.a 00:02:16.562 [425/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:02:16.562 [426/740] Linking static target lib/librte_rib.a 00:02:16.562 [427/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:02:16.562 [428/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:16.562 [429/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:16.562 [430/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:02:16.562 [431/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:02:16.562 [432/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.823 [433/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:02:16.823 [434/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.823 [435/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:02:16.823 [436/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:02:16.823 [437/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.823 [438/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:02:16.823 [439/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.823 [440/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:02:16.823 [441/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:02:16.823 [442/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:02:16.823 [443/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:02:16.823 [444/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:02:16.823 [445/740] Linking static target lib/librte_efd.a 00:02:16.823 [446/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:16.823 [447/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:16.823 [448/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:16.823 [449/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:16.823 [450/740] Linking static target drivers/librte_bus_vdev.a 00:02:16.823 [451/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:02:16.823 [452/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:02:16.823 [453/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:02:17.090 [454/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:17.090 [455/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:02:17.090 [456/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.090 [457/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 
00:02:17.090 [458/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:02:17.090 [459/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.090 [460/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:02:17.090 [461/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.090 [462/740] Linking static target lib/librte_fib.a 00:02:17.090 [463/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.090 [464/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:02:17.090 [465/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:02:17.090 [466/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:02:17.090 [467/740] Linking static target lib/librte_pdump.a 00:02:17.090 [468/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:02:17.090 [469/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:02:17.090 [470/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.354 [471/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:17.354 [472/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:02:17.354 [473/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.354 [474/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.354 [475/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:17.354 [476/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:17.354 [477/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:02:17.354 [478/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:02:17.354 [479/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:02:17.354 [480/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:17.354 [481/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.354 [482/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:17.354 [483/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:02:17.354 [484/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:02:17.354 [485/740] Linking static target drivers/librte_bus_pci.a 00:02:17.354 [486/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.354 [487/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:02:17.354 [488/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.354 [489/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:02:17.354 [490/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:02:17.354 [491/740] Linking static target lib/librte_table.a 00:02:17.354 [492/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:02:17.354 [493/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:02:17.615 [494/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 
00:02:17.615 [495/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:02:17.615 [496/740] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:02:17.615 [497/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:02:17.615 [498/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:02:17.615 [499/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:02:17.615 [500/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:02:17.615 [501/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:02:17.615 [502/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:02:17.615 [503/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:02:17.615 [504/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:02:17.615 [505/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:02:17.615 [506/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:02:17.615 [507/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.615 [508/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:02:17.615 [509/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:02:17.615 [510/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:02:17.615 [511/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:02:17.615 [512/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.615 [513/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:02:17.615 [514/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:02:17.615 [515/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:02:17.615 [516/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:17.615 [517/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:02:17.615 [518/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:17.615 [519/740] Linking static target lib/librte_cryptodev.a 00:02:17.615 [520/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:02:17.615 [521/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:17.615 [522/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:02:17.615 [523/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:02:17.615 [524/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.875 [525/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:02:17.875 [526/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:02:17.875 [527/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:02:17.875 [528/740] Linking static target lib/librte_node.a 00:02:17.875 [529/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:02:17.875 [530/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:02:17.875 [531/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:02:17.875 [532/740] 
Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:02:17.875 [533/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:02:17.875 [534/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.875 [535/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:02:17.875 [536/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:02:17.875 [537/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:02:17.875 [538/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:02:17.875 [539/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:02:17.875 [540/740] Linking static target lib/librte_ipsec.a 00:02:17.875 [541/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:02:17.875 [542/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:17.875 [543/740] Linking static target lib/librte_sched.a 00:02:17.875 [544/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:02:17.875 [545/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:17.875 [546/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:17.875 [547/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:02:17.875 [548/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.875 [549/740] Linking static target drivers/librte_mempool_ring.a 00:02:17.875 [550/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:02:17.875 [551/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:02:17.875 [552/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:17.875 [553/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:02:18.134 [554/740] Linking static target lib/librte_ethdev.a 00:02:18.134 [555/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:02:18.134 [556/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:02:18.134 [557/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:02:18.134 [558/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:02:18.134 [559/740] Linking static target lib/librte_member.a 00:02:18.134 [560/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.134 [561/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:02:18.134 [562/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:02:18.134 [563/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:02:18.134 [564/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:02:18.134 [565/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:02:18.134 [566/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:02:18.134 [567/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:02:18.134 [568/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:02:18.134 [569/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 
00:02:18.134 [570/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:02:18.134 [571/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:02:18.134 [572/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:02:18.134 [573/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:02:18.134 [574/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:02:18.134 [575/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:02:18.392 [576/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:02:18.392 [577/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:02:18.392 [578/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:02:18.392 [579/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:02:18.392 [580/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:02:18.392 [581/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:02:18.392 [582/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:02:18.392 [583/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:02:18.392 [584/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:02:18.392 [585/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:02:18.392 [586/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.392 [587/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:02:18.392 [588/740] Linking static target lib/librte_port.a 00:02:18.392 [589/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:18.392 [590/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.392 [591/740] Linking static target lib/librte_eventdev.a 00:02:18.392 [592/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:02:18.392 [593/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:02:18.392 [594/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:02:18.392 [595/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:02:18.392 [596/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:02:18.392 [597/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:02:18.651 [598/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.651 [599/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:02:18.651 [600/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:02:18.651 [601/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:02:18.651 [602/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:18.651 [603/740] Linking static target lib/librte_hash.a 00:02:18.651 [604/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.651 [605/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:02:18.651 [606/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:02:18.651 [607/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:02:18.909 [608/740] Linking static target 
drivers/net/i40e/base/libi40e_base.a 00:02:18.909 [609/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:02:18.909 [610/740] Linking static target lib/librte_acl.a 00:02:18.909 [611/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:02:19.167 [612/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:02:19.167 [613/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:02:19.167 [614/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.426 [615/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.426 [616/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:02:19.426 [617/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:02:19.684 [618/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.943 [619/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:02:19.943 [620/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:02:20.201 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:20.767 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:02:20.767 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:02:21.025 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:02:21.025 [625/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:21.025 [626/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:21.283 [627/740] Linking static target drivers/librte_net_i40e.a 00:02:21.283 [628/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.541 [629/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:21.799 [630/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.799 [631/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:21.799 [632/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:22.365 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.659 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.659 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:27.659 [636/740] Linking static target lib/librte_vhost.a 00:02:28.595 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:28.595 [638/740] Linking static target lib/librte_pipeline.a 00:02:28.853 [639/740] Linking target app/dpdk-proc-info 00:02:28.853 [640/740] Linking target app/dpdk-pdump 00:02:28.853 [641/740] Linking target app/dpdk-dumpcap 00:02:28.854 [642/740] Linking target app/dpdk-test-fib 00:02:28.854 [643/740] Linking target app/dpdk-test-bbdev 00:02:28.854 [644/740] Linking target app/dpdk-test-security-perf 00:02:28.854 [645/740] Linking target app/dpdk-test-gpudev 00:02:28.854 [646/740] Linking target app/dpdk-test-cmdline 00:02:28.854 [647/740] Linking target app/dpdk-test-sad 00:02:28.854 [648/740] Linking target app/dpdk-test-acl 00:02:28.854 [649/740] Linking target app/dpdk-test-pipeline 00:02:28.854 [650/740] Linking target app/dpdk-test-regex 00:02:28.854 [651/740] Linking target 
app/dpdk-test-compress-perf
00:02:28.854 [652/740] Linking target app/dpdk-test-flow-perf
00:02:28.854 [653/740] Linking target app/dpdk-test-crypto-perf
00:02:28.854 [654/740] Linking target app/dpdk-test-eventdev
00:02:28.854 [655/740] Linking target app/dpdk-testpmd
00:02:29.787 [656/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output)
00:02:30.046 [657/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output)
00:02:30.046 [658/740] Linking target lib/librte_eal.so.23.0
00:02:30.304 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols
00:02:30.304 [660/740] Linking target lib/librte_meter.so.23.0
00:02:30.304 [661/740] Linking target lib/librte_rawdev.so.23.0
00:02:30.304 [662/740] Linking target lib/librte_ring.so.23.0
00:02:30.304 [663/740] Linking target lib/librte_pci.so.23.0
00:02:30.304 [664/740] Linking target lib/librte_timer.so.23.0
00:02:30.304 [665/740] Linking target lib/librte_cfgfile.so.23.0
00:02:30.304 [666/740] Linking target lib/librte_stack.so.23.0
00:02:30.304 [667/740] Linking target lib/librte_dmadev.so.23.0
00:02:30.304 [668/740] Linking target lib/librte_graph.so.23.0
00:02:30.304 [669/740] Linking target drivers/librte_bus_vdev.so.23.0
00:02:30.304 [670/740] Linking target lib/librte_acl.so.23.0
00:02:30.304 [671/740] Linking target lib/librte_jobstats.so.23.0
00:02:30.562 [672/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols
00:02:30.562 [673/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols
00:02:30.562 [674/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols
00:02:30.562 [675/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols
00:02:30.562 [676/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols
00:02:30.562 [677/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols
00:02:30.562 [678/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols
00:02:30.562 [679/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols
00:02:30.562 [680/740] Linking target lib/librte_mempool.so.23.0
00:02:30.562 [681/740] Linking target lib/librte_rcu.so.23.0
00:02:30.562 [682/740] Linking target drivers/librte_bus_pci.so.23.0
00:02:30.562 [683/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols
00:02:30.562 [684/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols
00:02:30.562 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols
00:02:30.562 [686/740] Linking target lib/librte_mbuf.so.23.0
00:02:30.562 [687/740] Linking target drivers/librte_mempool_ring.so.23.0
00:02:30.562 [688/740] Linking target lib/librte_rib.so.23.0
00:02:30.820 [689/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols
00:02:30.820 [690/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols
00:02:30.820 [691/740] Linking target lib/librte_regexdev.so.23.0
00:02:30.820 [692/740] Linking target lib/librte_sched.so.23.0
00:02:30.820 [693/740] Linking target lib/librte_reorder.so.23.0
00:02:30.820 [694/740] Linking target lib/librte_bbdev.so.23.0
00:02:30.820 [695/740] Linking target lib/librte_compressdev.so.23.0
00:02:30.820 [696/740] Linking target lib/librte_net.so.23.0
00:02:30.820 [697/740] Linking target lib/librte_gpudev.so.23.0
00:02:30.820 [698/740] Linking target lib/librte_distributor.so.23.0
00:02:30.820 [699/740] Linking target lib/librte_cryptodev.so.23.0
00:02:30.820 [700/740] Linking target lib/librte_fib.so.23.0
00:02:31.079 [701/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols
00:02:31.079 [702/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols
00:02:31.079 [703/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols
00:02:31.079 [704/740] Linking target lib/librte_security.so.23.0
00:02:31.079 [705/740] Linking target lib/librte_hash.so.23.0
00:02:31.079 [706/740] Linking target lib/librte_cmdline.so.23.0
00:02:31.079 [707/740] Linking target lib/librte_ethdev.so.23.0
00:02:31.079 [708/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols
00:02:31.079 [709/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols
00:02:31.079 [710/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols
00:02:31.079 [711/740] Linking target lib/librte_lpm.so.23.0
00:02:31.079 [712/740] Linking target lib/librte_member.so.23.0
00:02:31.079 [713/740] Linking target lib/librte_efd.so.23.0
00:02:31.338 [714/740] Linking target lib/librte_ipsec.so.23.0
00:02:31.338 [715/740] Linking target lib/librte_pcapng.so.23.0
00:02:31.338 [716/740] Linking target lib/librte_metrics.so.23.0
00:02:31.338 [717/740] Linking target lib/librte_ip_frag.so.23.0
00:02:31.338 [718/740] Linking target lib/librte_gso.so.23.0
00:02:31.338 [719/740] Linking target lib/librte_power.so.23.0
00:02:31.338 [720/740] Linking target lib/librte_bpf.so.23.0
00:02:31.338 [721/740] Linking target lib/librte_gro.so.23.0
00:02:31.338 [722/740] Linking target lib/librte_vhost.so.23.0
00:02:31.338 [723/740] Linking target lib/librte_eventdev.so.23.0
00:02:31.338 [724/740] Linking target drivers/librte_net_i40e.so.23.0
00:02:31.338 [725/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols
00:02:31.338 [726/740] Linking target lib/librte_node.so.23.0
00:02:31.338 [727/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols
00:02:31.338 [728/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols
00:02:31.338 [729/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols
00:02:31.338 [730/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols
00:02:31.338 [731/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols
00:02:31.338 [732/740] Linking target lib/librte_bitratestats.so.23.0
00:02:31.338 [733/740] Linking target lib/librte_latencystats.so.23.0
00:02:31.338 [734/740] Linking target lib/librte_pdump.so.23.0
00:02:31.596 [735/740] Linking target lib/librte_port.so.23.0
00:02:31.596 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols
00:02:31.596 [737/740] Linking target lib/librte_table.so.23.0
00:02:31.854 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols
00:02:33.231 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output)
00:02:33.231 [740/740] Linking target lib/librte_pipeline.so.23.0
00:02:33.231 01:18:19 build_native_dpdk -- common/autobuild_common.sh@194 -- $ uname -s
00:02:33.231 01:18:19 build_native_dpdk -- common/autobuild_common.sh@194 -- 
$ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:33.231 01:18:19 build_native_dpdk -- common/autobuild_common.sh@207 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:33.231 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:33.231 [0/1] Installing files. 00:02:33.494 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 
00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:33.495 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:33.495 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:33.496 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.496 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.497 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.498 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:33.499 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:33.499 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:33.499 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:33.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:33.760 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing 
lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.760 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_gso.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:33.761 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:33.761 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:33.761 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:33.761 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:33.761 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:33.761 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 
Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 
00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:33.762 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.026 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.027 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.028 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.029 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:34.030 
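The four usertools scripts staged into build/bin above are the standard DPDK helper utilities. A minimal usage sketch, assuming a root shell; the PCI address 0000:3b:00.0 is a placeholder, not taken from this run, and the flag names are from memory of the DPDK 22.11 usertools, so confirm with --help:

    # Show current hugepage reservation and NIC driver bindings
    ./build/bin/dpdk-hugepages.py --show
    ./build/bin/dpdk-devbind.py --status
    # Bind an example NIC (placeholder address) to vfio-pci for DPDK use
    ./build/bin/dpdk-devbind.py -b vfio-pci 0000:3b:00.0
    # Open an interactive session against the telemetry socket of a running DPDK app
    ./build/bin/dpdk-telemetry.py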
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:34.030 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:34.030 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:02:34.030 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:34.030 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:02:34.030 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:34.030 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:02:34.030 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:34.030 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:02:34.030 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:34.030 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:02:34.030 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:34.030 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:02:34.030 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:34.030 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:02:34.030 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:34.030 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:02:34.030 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:34.030 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:02:34.030 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:34.030 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:02:34.030 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:34.030 Installing symlink pointing to librte_pci.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:02:34.030 Installing symlink pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:34.030 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:02:34.030 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:34.030 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:02:34.030 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:34.030 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:02:34.030 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:34.030 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:02:34.030 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:34.030 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:02:34.030 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:34.030 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:02:34.030 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:34.030 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:02:34.030 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:34.030 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:02:34.030 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:34.030 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:02:34.030 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:34.030 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:02:34.030 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:34.030 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:02:34.030 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:34.030 Installing symlink pointing 
to librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:02:34.030 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:34.030 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:02:34.030 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:34.030 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:02:34.030 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:34.030 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:02:34.030 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:34.030 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:02:34.030 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:34.030 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:02:34.030 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:34.030 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:02:34.030 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:34.030 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:02:34.030 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:34.030 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:02:34.030 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:34.030 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:02:34.030 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:34.031 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:02:34.031 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:34.031 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:02:34.031 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:34.031 
Installing symlink pointing to librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:02:34.031 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:34.031 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:02:34.031 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:34.031 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:02:34.031 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:34.031 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:02:34.031 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:34.031 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:02:34.031 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:34.031 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:02:34.031 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:34.031 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:02:34.031 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:34.031 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:02:34.031 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:34.031 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:02:34.031 Installing symlink pointing to librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:34.031 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:02:34.031 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:34.031 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:02:34.031 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:02:34.031 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:02:34.031 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:02:34.031 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:02:34.031 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:02:34.031 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:02:34.031 './librte_mempool_ring.so.23' -> 
'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:02:34.031 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:02:34.031 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:02:34.031 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:02:34.031 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:02:34.031 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:02:34.031 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:34.031 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:02:34.031 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:34.031 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:02:34.031 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:34.031 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:02:34.031 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:34.031 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:02:34.031 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:34.031 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:02:34.031 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:34.031 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:02:34.031 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:34.031 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:02:34.031 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:34.031 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:02:34.031 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:02:34.031 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:02:34.031 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:02:34.031 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:02:34.031 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:02:34.031 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:02:34.031 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:02:34.031 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:02:34.031 01:18:19 build_native_dpdk -- common/autobuild_common.sh@213 -- $ cat 00:02:34.031 01:18:19 build_native_dpdk -- common/autobuild_common.sh@218 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:34.031 00:02:34.031 real 0m25.879s 00:02:34.031 user 6m37.193s 00:02:34.031 sys 2m15.467s 00:02:34.031 01:18:19 build_native_dpdk -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:34.031 01:18:19 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:02:34.031 ************************************ 00:02:34.031 END TEST build_native_dpdk 00:02:34.031 ************************************ 00:02:34.031 01:18:19 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:34.031 01:18:19 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:34.031 01:18:19 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:34.031 01:18:19 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:34.031 01:18:19 -- common/autobuild_common.sh@438 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:34.031 01:18:19 -- common/autotest_common.sh@1101 -- $ '[' 2 -le 1 ']' 00:02:34.031 01:18:19 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:34.031 01:18:19 -- common/autotest_common.sh@10 -- $ set +x 00:02:34.031 ************************************ 00:02:34.031 START TEST autobuild_llvm_precompile 00:02:34.031 ************************************ 00:02:34.031 01:18:19 autobuild_llvm_precompile -- common/autotest_common.sh@1125 -- $ _llvm_precompile 00:02:34.031 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version 00:02:34.031 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:02:34.031 Target: x86_64-redhat-linux-gnu 00:02:34.031 Thread model: posix 00:02:34.031 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:34.031 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17 00:02:34.031 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:02:34.031 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:02:34.031 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:02:34.032 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:02:34.032 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:02:34.032 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 
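The autobuild_llvm_precompile step above derives the clang major version from the "clang --version" banner and uses it to pick the compiler names and the libFuzzer runtime that later lands in --with-fuzzer. A minimal bash sketch of the same idea (not the autobuild_common.sh source itself; the glob is a simplified stand-in for the extended @(...) pattern shown in the log):

    #!/usr/bin/env bash
    # Extract "17.0.6" and "17" from the clang version banner, as in the log above.
    if [[ "$(clang --version)" =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]; then
        clang_version=${BASH_REMATCH[1]}   # e.g. 17.0.6
        clang_num=${BASH_REMATCH[2]}       # e.g. 17
    fi
    export CC=clang-$clang_num CXX=clang++-$clang_num
    # Locate the fuzzer runtime that ends up in --with-fuzzer (simplified glob).
    fuzzer_libs=(/usr/lib*/clang/"$clang_num"*/lib/*linux*/libclang_rt.fuzzer_no_main*.a)
    fuzzer_lib=${fuzzer_libs[0]}
    echo "CC=$CC CXX=$CXX fuzzer runtime: $fuzzer_lib"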
00:02:34.032 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:02:34.032 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:02:34.032 01:18:19 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:34.293 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:34.552 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:34.552 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:34.552 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:35.119 Using 'verbs' RDMA provider 00:02:50.932 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:03.138 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:03.396 Creating mk/config.mk...done. 00:03:03.396 Creating mk/cc.flags.mk...done. 00:03:03.396 Type 'make' to build. 00:03:03.396 00:03:03.396 real 0m29.266s 00:03:03.396 user 0m12.837s 00:03:03.396 sys 0m15.805s 00:03:03.396 01:18:49 autobuild_llvm_precompile -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:03:03.396 01:18:49 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x 00:03:03.396 ************************************ 00:03:03.396 END TEST autobuild_llvm_precompile 00:03:03.396 ************************************ 00:03:03.396 01:18:49 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:03:03.396 01:18:49 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:03:03.396 01:18:49 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:03:03.396 01:18:49 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:03:03.396 01:18:49 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:03:03.655 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 
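Both configure runs above point --with-dpdk at the freshly staged build tree, and the "Using .../dpdk/build/lib/pkgconfig for additional libs" lines show that the libdpdk.pc installed earlier is what wires the DPDK paths in. A hedged sketch of consuming the same staged DPDK through pkg-config directly (my_app.c is a placeholder source file, not part of this run):

    # Point pkg-config at the staged DPDK build, as SPDK's configure does.
    DPDK_BUILD=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
    export PKG_CONFIG_PATH=$DPDK_BUILD/lib/pkgconfig
    pkg-config --modversion libdpdk    # prints the version of the staged DPDK
    # Resolve compile and link flags for an out-of-tree consumer.
    cc $(pkg-config --cflags libdpdk) my_app.c $(pkg-config --libs libdpdk) -o my_app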
00:03:03.655 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:03:03.655 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:03.912 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:03:04.170 Using 'verbs' RDMA provider 00:03:17.750 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:27.723 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:28.548 Creating mk/config.mk...done. 00:03:28.548 Creating mk/cc.flags.mk...done. 00:03:28.548 Type 'make' to build. 00:03:28.548 01:19:14 -- spdk/autobuild.sh@70 -- $ run_test make make -j112 00:03:28.548 01:19:14 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:03:28.548 01:19:14 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:03:28.548 01:19:14 -- common/autotest_common.sh@10 -- $ set +x 00:03:28.548 ************************************ 00:03:28.548 START TEST make 00:03:28.548 ************************************ 00:03:28.548 01:19:14 make -- common/autotest_common.sh@1125 -- $ make -j112 00:03:28.806 make[1]: Nothing to be done for 'all'. 00:03:30.714 The Meson build system 00:03:30.714 Version: 1.5.0 00:03:30.714 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:30.714 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:30.714 Build type: native build 00:03:30.714 Project name: libvfio-user 00:03:30.714 Project version: 0.0.1 00:03:30.714 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:03:30.714 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:03:30.714 Host machine cpu family: x86_64 00:03:30.714 Host machine cpu: x86_64 00:03:30.714 Run-time dependency threads found: YES 00:03:30.714 Library dl found: YES 00:03:30.714 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:03:30.714 Run-time dependency json-c found: YES 0.17 00:03:30.714 Run-time dependency cmocka found: YES 1.1.7 00:03:30.714 Program pytest-3 found: NO 00:03:30.714 Program flake8 found: NO 00:03:30.714 Program misspell-fixer found: NO 00:03:30.714 Program restructuredtext-lint found: NO 00:03:30.714 Program valgrind found: YES (/usr/bin/valgrind) 00:03:30.714 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:30.714 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:30.714 Compiler for C supports arguments -Wwrite-strings: YES 00:03:30.714 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:30.714 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:30.714 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:30.714 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:03:30.714 Build targets in project: 8 00:03:30.714 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:30.714 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:30.714 00:03:30.714 libvfio-user 0.0.1 00:03:30.714 00:03:30.714 User defined options 00:03:30.714 buildtype : debug 00:03:30.714 default_library: static 00:03:30.714 libdir : /usr/local/lib 00:03:30.714 00:03:30.714 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:30.714 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:30.973 [1/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:30.973 [2/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:30.973 [3/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:30.973 [4/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:30.973 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:30.973 [6/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:30.973 [7/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:30.973 [8/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:30.973 [9/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:30.973 [10/36] Compiling C object samples/null.p/null.c.o 00:03:30.973 [11/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:30.973 [12/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:30.973 [13/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:30.973 [14/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:30.973 [15/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:30.973 [16/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:30.973 [17/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:30.973 [18/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:30.973 [19/36] Compiling C object samples/server.p/server.c.o 00:03:30.973 [20/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:30.973 [21/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:30.973 [22/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:30.973 [23/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:30.973 [24/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:30.973 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:30.973 [26/36] Compiling C object samples/client.p/client.c.o 00:03:30.973 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:30.973 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:30.973 [29/36] Linking static target lib/libvfio-user.a 00:03:30.973 [30/36] Linking target samples/client 00:03:30.973 [31/36] Linking target test/unit_tests 00:03:30.973 [32/36] Linking target samples/lspci 00:03:30.973 [33/36] Linking target samples/gpio-pci-idio-16 00:03:30.973 [34/36] Linking target samples/shadow_ioeventfd_server 00:03:30.973 [35/36] Linking target samples/null 00:03:30.973 [36/36] Linking target samples/server 00:03:30.973 INFO: autodetecting backend as ninja 00:03:30.973 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:31.231 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:31.491 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:31.491 ninja: no work to do. 00:03:43.702 CC lib/ut_mock/mock.o 00:03:43.702 CC lib/log/log_deprecated.o 00:03:43.702 CC lib/log/log.o 00:03:43.702 CC lib/log/log_flags.o 00:03:43.702 CC lib/ut/ut.o 00:03:43.702 LIB libspdk_ut_mock.a 00:03:43.702 LIB libspdk_log.a 00:03:43.702 LIB libspdk_ut.a 00:03:43.702 CC lib/ioat/ioat.o 00:03:43.702 CC lib/dma/dma.o 00:03:43.702 CC lib/util/base64.o 00:03:43.702 CXX lib/trace_parser/trace.o 00:03:43.702 CC lib/util/bit_array.o 00:03:43.702 CC lib/util/cpuset.o 00:03:43.702 CC lib/util/crc16.o 00:03:43.702 CC lib/util/crc32_ieee.o 00:03:43.702 CC lib/util/crc32.o 00:03:43.702 CC lib/util/crc32c.o 00:03:43.702 CC lib/util/crc64.o 00:03:43.702 CC lib/util/fd.o 00:03:43.702 CC lib/util/dif.o 00:03:43.702 CC lib/util/fd_group.o 00:03:43.702 CC lib/util/file.o 00:03:43.702 CC lib/util/hexlify.o 00:03:43.702 CC lib/util/iov.o 00:03:43.702 CC lib/util/math.o 00:03:43.702 CC lib/util/net.o 00:03:43.702 CC lib/util/pipe.o 00:03:43.702 CC lib/util/strerror_tls.o 00:03:43.702 CC lib/util/string.o 00:03:43.702 CC lib/util/uuid.o 00:03:43.702 CC lib/util/xor.o 00:03:43.702 CC lib/util/zipf.o 00:03:43.702 CC lib/util/md5.o 00:03:43.702 CC lib/vfio_user/host/vfio_user_pci.o 00:03:43.702 CC lib/vfio_user/host/vfio_user.o 00:03:43.702 LIB libspdk_dma.a 00:03:43.702 LIB libspdk_ioat.a 00:03:43.960 LIB libspdk_vfio_user.a 00:03:43.960 LIB libspdk_util.a 00:03:44.218 LIB libspdk_trace_parser.a 00:03:44.218 CC lib/vmd/led.o 00:03:44.218 CC lib/vmd/vmd.o 00:03:44.218 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:44.218 CC lib/rdma_provider/common.o 00:03:44.218 CC lib/conf/conf.o 00:03:44.218 CC lib/json/json_util.o 00:03:44.218 CC lib/json/json_write.o 00:03:44.218 CC lib/json/json_parse.o 00:03:44.218 CC lib/rdma_utils/rdma_utils.o 00:03:44.218 CC lib/env_dpdk/pci.o 00:03:44.218 CC lib/env_dpdk/env.o 00:03:44.218 CC lib/env_dpdk/memory.o 00:03:44.218 CC lib/env_dpdk/threads.o 00:03:44.218 CC lib/env_dpdk/init.o 00:03:44.218 CC lib/env_dpdk/pci_vmd.o 00:03:44.218 CC lib/env_dpdk/pci_ioat.o 00:03:44.218 CC lib/env_dpdk/pci_virtio.o 00:03:44.218 CC lib/idxd/idxd.o 00:03:44.218 CC lib/idxd/idxd_user.o 00:03:44.218 CC lib/env_dpdk/pci_idxd.o 00:03:44.218 CC lib/idxd/idxd_kernel.o 00:03:44.218 CC lib/env_dpdk/pci_event.o 00:03:44.218 CC lib/env_dpdk/sigbus_handler.o 00:03:44.218 CC lib/env_dpdk/pci_dpdk.o 00:03:44.218 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:44.218 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:44.476 LIB libspdk_rdma_provider.a 00:03:44.476 LIB libspdk_conf.a 00:03:44.476 LIB libspdk_json.a 00:03:44.476 LIB libspdk_rdma_utils.a 00:03:44.735 LIB libspdk_vmd.a 00:03:44.735 LIB libspdk_idxd.a 00:03:44.735 CC lib/jsonrpc/jsonrpc_server.o 00:03:44.735 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:44.735 CC lib/jsonrpc/jsonrpc_client.o 00:03:44.735 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:44.994 LIB libspdk_jsonrpc.a 00:03:45.253 LIB libspdk_env_dpdk.a 00:03:45.253 CC lib/rpc/rpc.o 00:03:45.512 LIB libspdk_rpc.a 00:03:45.771 CC lib/trace/trace.o 00:03:45.771 CC lib/trace/trace_flags.o 00:03:45.771 CC lib/trace/trace_rpc.o 00:03:45.771 CC lib/notify/notify.o 00:03:45.771 CC lib/notify/notify_rpc.o 00:03:45.771 CC lib/keyring/keyring.o 00:03:45.771 CC lib/keyring/keyring_rpc.o 00:03:45.771 LIB libspdk_notify.a 00:03:45.771 LIB libspdk_trace.a 00:03:45.771 LIB 
libspdk_keyring.a 00:03:46.030 CC lib/thread/thread.o 00:03:46.030 CC lib/thread/iobuf.o 00:03:46.030 CC lib/sock/sock.o 00:03:46.030 CC lib/sock/sock_rpc.o 00:03:46.288 LIB libspdk_sock.a 00:03:46.856 CC lib/nvme/nvme_ctrlr.o 00:03:46.856 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:46.856 CC lib/nvme/nvme_fabric.o 00:03:46.856 CC lib/nvme/nvme_ns_cmd.o 00:03:46.856 CC lib/nvme/nvme_ns.o 00:03:46.856 CC lib/nvme/nvme_pcie_common.o 00:03:46.856 CC lib/nvme/nvme_pcie.o 00:03:46.856 CC lib/nvme/nvme_qpair.o 00:03:46.856 CC lib/nvme/nvme.o 00:03:46.856 CC lib/nvme/nvme_discovery.o 00:03:46.856 CC lib/nvme/nvme_quirks.o 00:03:46.856 CC lib/nvme/nvme_transport.o 00:03:46.856 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:46.856 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:46.856 CC lib/nvme/nvme_tcp.o 00:03:46.856 CC lib/nvme/nvme_opal.o 00:03:46.856 CC lib/nvme/nvme_zns.o 00:03:46.856 CC lib/nvme/nvme_io_msg.o 00:03:46.856 CC lib/nvme/nvme_poll_group.o 00:03:46.856 CC lib/nvme/nvme_stubs.o 00:03:46.856 CC lib/nvme/nvme_auth.o 00:03:46.856 CC lib/nvme/nvme_cuse.o 00:03:46.856 CC lib/nvme/nvme_vfio_user.o 00:03:46.856 CC lib/nvme/nvme_rdma.o 00:03:46.856 LIB libspdk_thread.a 00:03:47.115 CC lib/vfu_tgt/tgt_rpc.o 00:03:47.115 CC lib/vfu_tgt/tgt_endpoint.o 00:03:47.115 CC lib/virtio/virtio.o 00:03:47.115 CC lib/virtio/virtio_vhost_user.o 00:03:47.115 CC lib/fsdev/fsdev.o 00:03:47.115 CC lib/accel/accel.o 00:03:47.115 CC lib/virtio/virtio_pci.o 00:03:47.115 CC lib/virtio/virtio_vfio_user.o 00:03:47.115 CC lib/fsdev/fsdev_io.o 00:03:47.115 CC lib/init/json_config.o 00:03:47.115 CC lib/accel/accel_rpc.o 00:03:47.115 CC lib/fsdev/fsdev_rpc.o 00:03:47.115 CC lib/accel/accel_sw.o 00:03:47.115 CC lib/init/rpc.o 00:03:47.115 CC lib/init/subsystem.o 00:03:47.115 CC lib/init/subsystem_rpc.o 00:03:47.115 CC lib/blob/blobstore.o 00:03:47.115 CC lib/blob/request.o 00:03:47.115 CC lib/blob/zeroes.o 00:03:47.115 CC lib/blob/blob_bs_dev.o 00:03:47.374 LIB libspdk_init.a 00:03:47.374 LIB libspdk_vfu_tgt.a 00:03:47.374 LIB libspdk_virtio.a 00:03:47.632 LIB libspdk_fsdev.a 00:03:47.632 CC lib/event/app.o 00:03:47.632 CC lib/event/reactor.o 00:03:47.632 CC lib/event/log_rpc.o 00:03:47.632 CC lib/event/app_rpc.o 00:03:47.632 CC lib/event/scheduler_static.o 00:03:47.891 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:03:47.891 LIB libspdk_event.a 00:03:47.891 LIB libspdk_accel.a 00:03:47.891 LIB libspdk_nvme.a 00:03:48.150 CC lib/bdev/bdev.o 00:03:48.150 CC lib/bdev/bdev_zone.o 00:03:48.150 CC lib/bdev/bdev_rpc.o 00:03:48.150 CC lib/bdev/part.o 00:03:48.150 CC lib/bdev/scsi_nvme.o 00:03:48.150 LIB libspdk_fuse_dispatcher.a 00:03:49.087 LIB libspdk_blob.a 00:03:49.087 CC lib/blobfs/blobfs.o 00:03:49.087 CC lib/blobfs/tree.o 00:03:49.087 CC lib/lvol/lvol.o 00:03:49.655 LIB libspdk_lvol.a 00:03:49.655 LIB libspdk_blobfs.a 00:03:49.913 LIB libspdk_bdev.a 00:03:50.177 CC lib/nvmf/ctrlr.o 00:03:50.177 CC lib/nvmf/ctrlr_discovery.o 00:03:50.177 CC lib/nvmf/ctrlr_bdev.o 00:03:50.177 CC lib/nvmf/nvmf.o 00:03:50.177 CC lib/nvmf/subsystem.o 00:03:50.177 CC lib/nvmf/stubs.o 00:03:50.177 CC lib/nvmf/nvmf_rpc.o 00:03:50.177 CC lib/nvmf/transport.o 00:03:50.177 CC lib/nvmf/mdns_server.o 00:03:50.177 CC lib/nvmf/tcp.o 00:03:50.177 CC lib/nvmf/vfio_user.o 00:03:50.177 CC lib/nvmf/rdma.o 00:03:50.177 CC lib/nvmf/auth.o 00:03:50.177 CC lib/nbd/nbd.o 00:03:50.177 CC lib/nbd/nbd_rpc.o 00:03:50.177 CC lib/scsi/dev.o 00:03:50.177 CC lib/scsi/lun.o 00:03:50.177 CC lib/scsi/port.o 00:03:50.177 CC lib/ublk/ublk.o 00:03:50.177 CC lib/scsi/scsi.o 00:03:50.177 CC 
lib/scsi/scsi_bdev.o 00:03:50.177 CC lib/ublk/ublk_rpc.o 00:03:50.177 CC lib/scsi/scsi_pr.o 00:03:50.177 CC lib/scsi/scsi_rpc.o 00:03:50.177 CC lib/scsi/task.o 00:03:50.177 CC lib/ftl/ftl_init.o 00:03:50.177 CC lib/ftl/ftl_core.o 00:03:50.177 CC lib/ftl/ftl_layout.o 00:03:50.177 CC lib/ftl/ftl_debug.o 00:03:50.177 CC lib/ftl/ftl_io.o 00:03:50.177 CC lib/ftl/ftl_sb.o 00:03:50.177 CC lib/ftl/ftl_l2p.o 00:03:50.177 CC lib/ftl/ftl_l2p_flat.o 00:03:50.177 CC lib/ftl/ftl_nv_cache.o 00:03:50.177 CC lib/ftl/ftl_band.o 00:03:50.177 CC lib/ftl/ftl_band_ops.o 00:03:50.177 CC lib/ftl/ftl_writer.o 00:03:50.177 CC lib/ftl/ftl_rq.o 00:03:50.177 CC lib/ftl/ftl_reloc.o 00:03:50.177 CC lib/ftl/ftl_l2p_cache.o 00:03:50.177 CC lib/ftl/ftl_p2l.o 00:03:50.177 CC lib/ftl/ftl_p2l_log.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:50.177 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:50.177 CC lib/ftl/utils/ftl_conf.o 00:03:50.177 CC lib/ftl/utils/ftl_mempool.o 00:03:50.177 CC lib/ftl/utils/ftl_bitmap.o 00:03:50.177 CC lib/ftl/utils/ftl_md.o 00:03:50.177 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:50.177 CC lib/ftl/utils/ftl_property.o 00:03:50.177 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:50.177 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:50.177 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:50.177 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:50.177 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:50.177 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:50.177 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:50.177 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:50.177 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:50.177 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:50.177 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:03:50.177 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:03:50.177 CC lib/ftl/base/ftl_base_dev.o 00:03:50.177 CC lib/ftl/base/ftl_base_bdev.o 00:03:50.177 CC lib/ftl/ftl_trace.o 00:03:50.438 LIB libspdk_scsi.a 00:03:50.438 LIB libspdk_nbd.a 00:03:50.697 LIB libspdk_ublk.a 00:03:50.697 CC lib/vhost/vhost_rpc.o 00:03:50.697 CC lib/vhost/vhost.o 00:03:50.697 CC lib/vhost/vhost_scsi.o 00:03:50.697 CC lib/vhost/rte_vhost_user.o 00:03:50.697 CC lib/vhost/vhost_blk.o 00:03:50.956 LIB libspdk_ftl.a 00:03:50.956 CC lib/iscsi/conn.o 00:03:50.956 CC lib/iscsi/init_grp.o 00:03:50.956 CC lib/iscsi/iscsi.o 00:03:50.956 CC lib/iscsi/param.o 00:03:50.956 CC lib/iscsi/portal_grp.o 00:03:50.956 CC lib/iscsi/tgt_node.o 00:03:50.956 CC lib/iscsi/task.o 00:03:50.956 CC lib/iscsi/iscsi_subsystem.o 00:03:50.956 CC lib/iscsi/iscsi_rpc.o 00:03:51.215 LIB libspdk_nvmf.a 00:03:51.473 LIB libspdk_vhost.a 00:03:51.473 LIB libspdk_iscsi.a 00:03:52.040 CC module/vfu_device/vfu_virtio.o 00:03:52.040 CC module/vfu_device/vfu_virtio_blk.o 00:03:52.040 CC module/vfu_device/vfu_virtio_scsi.o 00:03:52.040 CC module/vfu_device/vfu_virtio_rpc.o 00:03:52.040 CC module/vfu_device/vfu_virtio_fs.o 00:03:52.040 CC module/env_dpdk/env_dpdk_rpc.o 00:03:52.040 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:52.040 CC module/accel/ioat/accel_ioat.o 00:03:52.040 CC module/accel/ioat/accel_ioat_rpc.o 
00:03:52.040 CC module/scheduler/gscheduler/gscheduler.o 00:03:52.040 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:52.040 CC module/accel/error/accel_error.o 00:03:52.040 CC module/sock/posix/posix.o 00:03:52.040 CC module/accel/error/accel_error_rpc.o 00:03:52.041 CC module/keyring/file/keyring.o 00:03:52.041 CC module/keyring/file/keyring_rpc.o 00:03:52.041 CC module/blob/bdev/blob_bdev.o 00:03:52.041 LIB libspdk_env_dpdk_rpc.a 00:03:52.041 CC module/fsdev/aio/fsdev_aio.o 00:03:52.041 CC module/fsdev/aio/linux_aio_mgr.o 00:03:52.041 CC module/accel/iaa/accel_iaa.o 00:03:52.041 CC module/fsdev/aio/fsdev_aio_rpc.o 00:03:52.041 CC module/accel/dsa/accel_dsa.o 00:03:52.041 CC module/accel/iaa/accel_iaa_rpc.o 00:03:52.041 CC module/accel/dsa/accel_dsa_rpc.o 00:03:52.041 CC module/keyring/linux/keyring.o 00:03:52.041 CC module/keyring/linux/keyring_rpc.o 00:03:52.298 LIB libspdk_scheduler_dpdk_governor.a 00:03:52.298 LIB libspdk_keyring_file.a 00:03:52.298 LIB libspdk_scheduler_gscheduler.a 00:03:52.298 LIB libspdk_accel_ioat.a 00:03:52.298 LIB libspdk_scheduler_dynamic.a 00:03:52.298 LIB libspdk_accel_error.a 00:03:52.298 LIB libspdk_keyring_linux.a 00:03:52.298 LIB libspdk_accel_iaa.a 00:03:52.298 LIB libspdk_blob_bdev.a 00:03:52.298 LIB libspdk_accel_dsa.a 00:03:52.298 LIB libspdk_vfu_device.a 00:03:52.555 LIB libspdk_sock_posix.a 00:03:52.555 LIB libspdk_fsdev_aio.a 00:03:52.813 CC module/bdev/aio/bdev_aio.o 00:03:52.813 CC module/bdev/aio/bdev_aio_rpc.o 00:03:52.813 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:52.813 CC module/bdev/malloc/bdev_malloc.o 00:03:52.813 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:52.813 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:52.813 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:52.813 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:52.813 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:52.813 CC module/bdev/nvme/bdev_nvme.o 00:03:52.813 CC module/bdev/delay/vbdev_delay.o 00:03:52.813 CC module/blobfs/bdev/blobfs_bdev.o 00:03:52.813 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:52.813 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:52.813 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:52.813 CC module/bdev/nvme/nvme_rpc.o 00:03:52.813 CC module/bdev/nvme/bdev_mdns_client.o 00:03:52.813 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:52.813 CC module/bdev/nvme/vbdev_opal.o 00:03:52.813 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:52.813 CC module/bdev/error/vbdev_error_rpc.o 00:03:52.813 CC module/bdev/error/vbdev_error.o 00:03:52.813 CC module/bdev/passthru/vbdev_passthru.o 00:03:52.813 CC module/bdev/raid/bdev_raid_sb.o 00:03:52.813 CC module/bdev/raid/bdev_raid.o 00:03:52.813 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:52.813 CC module/bdev/raid/bdev_raid_rpc.o 00:03:52.813 CC module/bdev/raid/concat.o 00:03:52.813 CC module/bdev/raid/raid0.o 00:03:52.813 CC module/bdev/raid/raid1.o 00:03:52.813 CC module/bdev/gpt/vbdev_gpt.o 00:03:52.813 CC module/bdev/gpt/gpt.o 00:03:52.813 CC module/bdev/null/bdev_null.o 00:03:52.813 CC module/bdev/lvol/vbdev_lvol.o 00:03:52.813 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:52.813 CC module/bdev/null/bdev_null_rpc.o 00:03:52.813 CC module/bdev/iscsi/bdev_iscsi.o 00:03:52.813 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:52.813 CC module/bdev/ftl/bdev_ftl.o 00:03:52.813 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:52.813 CC module/bdev/split/vbdev_split.o 00:03:52.813 CC module/bdev/split/vbdev_split_rpc.o 00:03:52.813 LIB libspdk_blobfs_bdev.a 00:03:53.071 LIB libspdk_bdev_error.a 00:03:53.071 LIB 
libspdk_bdev_split.a 00:03:53.071 LIB libspdk_bdev_gpt.a 00:03:53.071 LIB libspdk_bdev_null.a 00:03:53.071 LIB libspdk_bdev_zone_block.a 00:03:53.071 LIB libspdk_bdev_aio.a 00:03:53.071 LIB libspdk_bdev_passthru.a 00:03:53.071 LIB libspdk_bdev_ftl.a 00:03:53.071 LIB libspdk_bdev_malloc.a 00:03:53.071 LIB libspdk_bdev_delay.a 00:03:53.071 LIB libspdk_bdev_iscsi.a 00:03:53.071 LIB libspdk_bdev_virtio.a 00:03:53.071 LIB libspdk_bdev_lvol.a 00:03:53.330 LIB libspdk_bdev_raid.a 00:03:53.896 LIB libspdk_bdev_nvme.a 00:03:54.462 CC module/event/subsystems/keyring/keyring.o 00:03:54.462 CC module/event/subsystems/vmd/vmd.o 00:03:54.462 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:54.462 CC module/event/subsystems/iobuf/iobuf.o 00:03:54.462 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:54.462 CC module/event/subsystems/fsdev/fsdev.o 00:03:54.719 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:54.719 CC module/event/subsystems/sock/sock.o 00:03:54.719 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:54.719 CC module/event/subsystems/scheduler/scheduler.o 00:03:54.719 LIB libspdk_event_keyring.a 00:03:54.719 LIB libspdk_event_vmd.a 00:03:54.719 LIB libspdk_event_fsdev.a 00:03:54.719 LIB libspdk_event_vhost_blk.a 00:03:54.719 LIB libspdk_event_iobuf.a 00:03:54.719 LIB libspdk_event_scheduler.a 00:03:54.719 LIB libspdk_event_vfu_tgt.a 00:03:54.719 LIB libspdk_event_sock.a 00:03:54.978 CC module/event/subsystems/accel/accel.o 00:03:55.301 LIB libspdk_event_accel.a 00:03:55.597 CC module/event/subsystems/bdev/bdev.o 00:03:55.597 LIB libspdk_event_bdev.a 00:03:55.885 CC module/event/subsystems/ublk/ublk.o 00:03:55.885 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:55.885 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:55.885 CC module/event/subsystems/nbd/nbd.o 00:03:55.885 CC module/event/subsystems/scsi/scsi.o 00:03:55.885 LIB libspdk_event_ublk.a 00:03:55.885 LIB libspdk_event_nbd.a 00:03:56.182 LIB libspdk_event_scsi.a 00:03:56.182 LIB libspdk_event_nvmf.a 00:03:56.445 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:56.445 CC module/event/subsystems/iscsi/iscsi.o 00:03:56.445 LIB libspdk_event_vhost_scsi.a 00:03:56.445 LIB libspdk_event_iscsi.a 00:03:56.703 CC test/rpc_client/rpc_client_test.o 00:03:56.703 TEST_HEADER include/spdk/accel_module.h 00:03:56.703 TEST_HEADER include/spdk/assert.h 00:03:56.703 TEST_HEADER include/spdk/accel.h 00:03:56.703 TEST_HEADER include/spdk/barrier.h 00:03:56.703 CC app/spdk_top/spdk_top.o 00:03:56.703 TEST_HEADER include/spdk/bdev_module.h 00:03:56.703 TEST_HEADER include/spdk/base64.h 00:03:56.703 TEST_HEADER include/spdk/bdev.h 00:03:56.703 TEST_HEADER include/spdk/bdev_zone.h 00:03:56.703 CXX app/trace/trace.o 00:03:56.703 TEST_HEADER include/spdk/bit_array.h 00:03:56.703 TEST_HEADER include/spdk/blob_bdev.h 00:03:56.703 TEST_HEADER include/spdk/blobfs.h 00:03:56.703 TEST_HEADER include/spdk/bit_pool.h 00:03:56.703 TEST_HEADER include/spdk/blob.h 00:03:56.703 CC app/spdk_lspci/spdk_lspci.o 00:03:56.703 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:56.703 TEST_HEADER include/spdk/config.h 00:03:56.703 TEST_HEADER include/spdk/conf.h 00:03:56.703 CC app/spdk_nvme_discover/discovery_aer.o 00:03:56.703 TEST_HEADER include/spdk/cpuset.h 00:03:56.703 TEST_HEADER include/spdk/crc64.h 00:03:56.703 TEST_HEADER include/spdk/crc16.h 00:03:56.703 CC app/trace_record/trace_record.o 00:03:56.703 TEST_HEADER include/spdk/dif.h 00:03:56.703 TEST_HEADER include/spdk/crc32.h 00:03:56.703 TEST_HEADER include/spdk/dma.h 00:03:56.703 TEST_HEADER 
include/spdk/endian.h 00:03:56.703 TEST_HEADER include/spdk/env.h 00:03:56.703 TEST_HEADER include/spdk/event.h 00:03:56.703 TEST_HEADER include/spdk/fd_group.h 00:03:56.703 TEST_HEADER include/spdk/env_dpdk.h 00:03:56.703 TEST_HEADER include/spdk/fsdev.h 00:03:56.703 TEST_HEADER include/spdk/fd.h 00:03:56.703 TEST_HEADER include/spdk/file.h 00:03:56.703 TEST_HEADER include/spdk/fsdev_module.h 00:03:56.703 TEST_HEADER include/spdk/fuse_dispatcher.h 00:03:56.703 TEST_HEADER include/spdk/hexlify.h 00:03:56.703 TEST_HEADER include/spdk/ftl.h 00:03:56.703 TEST_HEADER include/spdk/histogram_data.h 00:03:56.703 TEST_HEADER include/spdk/gpt_spec.h 00:03:56.703 CC app/spdk_nvme_identify/identify.o 00:03:56.703 TEST_HEADER include/spdk/idxd.h 00:03:56.703 TEST_HEADER include/spdk/ioat_spec.h 00:03:56.703 TEST_HEADER include/spdk/ioat.h 00:03:56.703 TEST_HEADER include/spdk/idxd_spec.h 00:03:56.703 TEST_HEADER include/spdk/init.h 00:03:56.703 TEST_HEADER include/spdk/iscsi_spec.h 00:03:56.703 TEST_HEADER include/spdk/jsonrpc.h 00:03:56.703 TEST_HEADER include/spdk/json.h 00:03:56.703 CC app/spdk_nvme_perf/perf.o 00:03:56.703 TEST_HEADER include/spdk/keyring_module.h 00:03:56.703 TEST_HEADER include/spdk/keyring.h 00:03:56.703 TEST_HEADER include/spdk/lvol.h 00:03:56.703 TEST_HEADER include/spdk/log.h 00:03:56.703 TEST_HEADER include/spdk/likely.h 00:03:56.703 TEST_HEADER include/spdk/md5.h 00:03:56.703 TEST_HEADER include/spdk/mmio.h 00:03:56.703 TEST_HEADER include/spdk/memory.h 00:03:56.703 CC app/nvmf_tgt/nvmf_main.o 00:03:56.703 TEST_HEADER include/spdk/nbd.h 00:03:56.703 TEST_HEADER include/spdk/notify.h 00:03:56.703 TEST_HEADER include/spdk/nvme.h 00:03:56.703 TEST_HEADER include/spdk/net.h 00:03:56.703 TEST_HEADER include/spdk/nvme_intel.h 00:03:56.703 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:56.703 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:56.703 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:56.703 TEST_HEADER include/spdk/nvme_zns.h 00:03:56.703 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:56.703 TEST_HEADER include/spdk/nvme_spec.h 00:03:56.703 TEST_HEADER include/spdk/nvmf.h 00:03:56.703 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:56.703 TEST_HEADER include/spdk/nvmf_spec.h 00:03:56.703 TEST_HEADER include/spdk/nvmf_transport.h 00:03:56.703 TEST_HEADER include/spdk/opal.h 00:03:56.703 TEST_HEADER include/spdk/opal_spec.h 00:03:56.703 TEST_HEADER include/spdk/queue.h 00:03:56.703 TEST_HEADER include/spdk/pci_ids.h 00:03:56.703 TEST_HEADER include/spdk/pipe.h 00:03:56.703 CC app/spdk_dd/spdk_dd.o 00:03:56.703 TEST_HEADER include/spdk/rpc.h 00:03:56.703 TEST_HEADER include/spdk/reduce.h 00:03:56.703 TEST_HEADER include/spdk/scheduler.h 00:03:56.703 TEST_HEADER include/spdk/scsi_spec.h 00:03:56.703 TEST_HEADER include/spdk/scsi.h 00:03:56.703 TEST_HEADER include/spdk/sock.h 00:03:56.971 TEST_HEADER include/spdk/stdinc.h 00:03:56.971 TEST_HEADER include/spdk/string.h 00:03:56.971 TEST_HEADER include/spdk/thread.h 00:03:56.971 TEST_HEADER include/spdk/trace.h 00:03:56.971 TEST_HEADER include/spdk/trace_parser.h 00:03:56.971 TEST_HEADER include/spdk/tree.h 00:03:56.971 TEST_HEADER include/spdk/ublk.h 00:03:56.971 TEST_HEADER include/spdk/uuid.h 00:03:56.971 TEST_HEADER include/spdk/version.h 00:03:56.971 CC app/iscsi_tgt/iscsi_tgt.o 00:03:56.971 TEST_HEADER include/spdk/util.h 00:03:56.971 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:56.971 TEST_HEADER include/spdk/vhost.h 00:03:56.971 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:56.971 TEST_HEADER include/spdk/vmd.h 00:03:56.971 
TEST_HEADER include/spdk/xor.h 00:03:56.971 CXX test/cpp_headers/accel.o 00:03:56.971 TEST_HEADER include/spdk/zipf.h 00:03:56.971 CXX test/cpp_headers/accel_module.o 00:03:56.971 CXX test/cpp_headers/barrier.o 00:03:56.971 CXX test/cpp_headers/assert.o 00:03:56.972 CXX test/cpp_headers/bdev.o 00:03:56.972 CXX test/cpp_headers/bdev_module.o 00:03:56.972 CXX test/cpp_headers/base64.o 00:03:56.972 CC test/env/pci/pci_ut.o 00:03:56.972 CXX test/cpp_headers/bdev_zone.o 00:03:56.972 CXX test/cpp_headers/blob_bdev.o 00:03:56.972 CC test/env/vtophys/vtophys.o 00:03:56.972 CXX test/cpp_headers/bit_pool.o 00:03:56.972 CXX test/cpp_headers/blobfs_bdev.o 00:03:56.972 CXX test/cpp_headers/bit_array.o 00:03:56.972 CXX test/cpp_headers/blobfs.o 00:03:56.972 CXX test/cpp_headers/blob.o 00:03:56.972 CC test/env/memory/memory_ut.o 00:03:56.972 CXX test/cpp_headers/config.o 00:03:56.972 CC app/spdk_tgt/spdk_tgt.o 00:03:56.972 CXX test/cpp_headers/conf.o 00:03:56.972 CXX test/cpp_headers/cpuset.o 00:03:56.972 CXX test/cpp_headers/crc32.o 00:03:56.972 CXX test/cpp_headers/crc16.o 00:03:56.972 CXX test/cpp_headers/dma.o 00:03:56.972 CXX test/cpp_headers/env_dpdk.o 00:03:56.972 CXX test/cpp_headers/dif.o 00:03:56.972 CXX test/cpp_headers/crc64.o 00:03:56.972 CXX test/cpp_headers/endian.o 00:03:56.972 CXX test/cpp_headers/event.o 00:03:56.972 CXX test/cpp_headers/env.o 00:03:56.972 CXX test/cpp_headers/fd_group.o 00:03:56.972 CXX test/cpp_headers/fd.o 00:03:56.972 CXX test/cpp_headers/file.o 00:03:56.972 CXX test/cpp_headers/fsdev_module.o 00:03:56.972 CXX test/cpp_headers/fsdev.o 00:03:56.972 CXX test/cpp_headers/ftl.o 00:03:56.972 CXX test/cpp_headers/fuse_dispatcher.o 00:03:56.972 CXX test/cpp_headers/hexlify.o 00:03:56.972 CXX test/cpp_headers/gpt_spec.o 00:03:56.972 CXX test/cpp_headers/histogram_data.o 00:03:56.972 CXX test/cpp_headers/idxd.o 00:03:56.972 CXX test/cpp_headers/init.o 00:03:56.972 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:56.972 CXX test/cpp_headers/idxd_spec.o 00:03:56.972 CXX test/cpp_headers/ioat.o 00:03:56.972 CXX test/cpp_headers/ioat_spec.o 00:03:56.972 CXX test/cpp_headers/iscsi_spec.o 00:03:56.972 CXX test/cpp_headers/json.o 00:03:56.972 CC examples/ioat/perf/perf.o 00:03:56.972 CXX test/cpp_headers/jsonrpc.o 00:03:56.972 CXX test/cpp_headers/keyring.o 00:03:56.972 CC test/thread/lock/spdk_lock.o 00:03:56.972 CXX test/cpp_headers/keyring_module.o 00:03:56.972 CXX test/cpp_headers/likely.o 00:03:56.972 CXX test/cpp_headers/lvol.o 00:03:56.972 CXX test/cpp_headers/log.o 00:03:56.972 CXX test/cpp_headers/md5.o 00:03:56.972 CXX test/cpp_headers/mmio.o 00:03:56.972 CXX test/cpp_headers/memory.o 00:03:56.972 CXX test/cpp_headers/nbd.o 00:03:56.972 CXX test/cpp_headers/net.o 00:03:56.972 CXX test/cpp_headers/nvme.o 00:03:56.972 CXX test/cpp_headers/notify.o 00:03:56.972 CXX test/cpp_headers/nvme_intel.o 00:03:56.972 CXX test/cpp_headers/nvme_ocssd.o 00:03:56.972 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:56.972 CC examples/util/zipf/zipf.o 00:03:56.972 CXX test/cpp_headers/nvme_zns.o 00:03:56.972 CXX test/cpp_headers/nvme_spec.o 00:03:56.972 CXX test/cpp_headers/nvmf_cmd.o 00:03:56.972 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:56.972 CXX test/cpp_headers/nvmf.o 00:03:56.972 CXX test/cpp_headers/nvmf_spec.o 00:03:56.972 CC test/thread/poller_perf/poller_perf.o 00:03:56.972 CXX test/cpp_headers/nvmf_transport.o 00:03:56.972 CXX test/cpp_headers/opal.o 00:03:56.972 CC test/app/jsoncat/jsoncat.o 00:03:56.972 CXX test/cpp_headers/opal_spec.o 00:03:56.972 CC 
app/fio/nvme/fio_plugin.o 00:03:56.972 CXX test/cpp_headers/pci_ids.o 00:03:56.972 CXX test/cpp_headers/pipe.o 00:03:56.972 CXX test/cpp_headers/reduce.o 00:03:56.972 CXX test/cpp_headers/queue.o 00:03:56.972 CXX test/cpp_headers/scheduler.o 00:03:56.972 CXX test/cpp_headers/rpc.o 00:03:56.972 CXX test/cpp_headers/scsi.o 00:03:56.972 CXX test/cpp_headers/scsi_spec.o 00:03:56.972 CXX test/cpp_headers/sock.o 00:03:56.972 CXX test/cpp_headers/stdinc.o 00:03:56.972 CXX test/cpp_headers/thread.o 00:03:56.972 CXX test/cpp_headers/string.o 00:03:56.972 CXX test/cpp_headers/trace.o 00:03:56.972 LINK spdk_lspci 00:03:56.972 CC examples/ioat/verify/verify.o 00:03:56.972 CC test/app/stub/stub.o 00:03:56.972 LINK rpc_client_test 00:03:56.972 CC test/app/histogram_perf/histogram_perf.o 00:03:56.972 CC test/dma/test_dma/test_dma.o 00:03:56.972 CC test/env/mem_callbacks/mem_callbacks.o 00:03:56.972 CC app/fio/bdev/fio_plugin.o 00:03:56.972 CC test/app/bdev_svc/bdev_svc.o 00:03:56.972 LINK spdk_nvme_discover 00:03:56.972 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:56.972 LINK spdk_trace_record 00:03:56.972 LINK interrupt_tgt 00:03:56.972 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:56.972 LINK nvmf_tgt 00:03:56.972 LINK vtophys 00:03:56.972 CXX test/cpp_headers/trace_parser.o 00:03:56.972 CXX test/cpp_headers/tree.o 00:03:56.972 LINK jsoncat 00:03:56.972 CXX test/cpp_headers/ublk.o 00:03:56.972 LINK env_dpdk_post_init 00:03:56.972 CXX test/cpp_headers/util.o 00:03:57.231 CXX test/cpp_headers/version.o 00:03:57.231 CXX test/cpp_headers/uuid.o 00:03:57.231 CXX test/cpp_headers/vfio_user_pci.o 00:03:57.231 CXX test/cpp_headers/vfio_user_spec.o 00:03:57.231 CXX test/cpp_headers/vhost.o 00:03:57.231 CXX test/cpp_headers/vmd.o 00:03:57.231 CXX test/cpp_headers/xor.o 00:03:57.231 CXX test/cpp_headers/zipf.o 00:03:57.231 LINK zipf 00:03:57.231 LINK poller_perf 00:03:57.231 LINK histogram_perf 00:03:57.231 LINK iscsi_tgt 00:03:57.231 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:57.231 LINK ioat_perf 00:03:57.231 LINK stub 00:03:57.231 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:57.231 LINK verify 00:03:57.231 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:57.231 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:57.231 LINK spdk_tgt 00:03:57.231 LINK bdev_svc 00:03:57.231 LINK spdk_trace 00:03:57.231 LINK mem_callbacks 00:03:57.231 LINK pci_ut 00:03:57.231 LINK spdk_dd 00:03:57.489 LINK test_dma 00:03:57.489 LINK nvme_fuzz 00:03:57.489 LINK llvm_vfio_fuzz 00:03:57.489 LINK spdk_nvme 00:03:57.489 LINK spdk_nvme_identify 00:03:57.489 LINK vhost_fuzz 00:03:57.489 LINK spdk_nvme_perf 00:03:57.489 LINK spdk_bdev 00:03:57.489 LINK spdk_top 00:03:57.747 LINK llvm_nvme_fuzz 00:03:57.748 LINK memory_ut 00:03:57.748 CC examples/vmd/lsvmd/lsvmd.o 00:03:57.748 CC examples/vmd/led/led.o 00:03:57.748 CC examples/idxd/perf/perf.o 00:03:57.748 CC examples/sock/hello_world/hello_sock.o 00:03:57.748 CC app/vhost/vhost.o 00:03:57.748 CC examples/thread/thread/thread_ex.o 00:03:57.748 LINK lsvmd 00:03:57.748 LINK led 00:03:58.006 LINK hello_sock 00:03:58.006 LINK vhost 00:03:58.006 LINK idxd_perf 00:03:58.006 LINK thread 00:03:58.006 LINK spdk_lock 00:03:58.006 LINK iscsi_fuzz 00:03:58.574 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:58.574 CC examples/nvme/hotplug/hotplug.o 00:03:58.574 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:58.574 CC examples/nvme/abort/abort.o 00:03:58.574 CC examples/nvme/hello_world/hello_world.o 00:03:58.574 CC examples/nvme/arbitration/arbitration.o 00:03:58.574 CC 
examples/nvme/reconnect/reconnect.o 00:03:58.574 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:58.574 CC test/event/reactor/reactor.o 00:03:58.574 CC test/event/event_perf/event_perf.o 00:03:58.574 CC test/event/reactor_perf/reactor_perf.o 00:03:58.574 CC test/event/app_repeat/app_repeat.o 00:03:58.832 CC test/event/scheduler/scheduler.o 00:03:58.832 LINK cmb_copy 00:03:58.832 LINK hotplug 00:03:58.832 LINK hello_world 00:03:58.832 LINK pmr_persistence 00:03:58.832 LINK reactor 00:03:58.832 LINK event_perf 00:03:58.832 LINK reactor_perf 00:03:58.832 LINK app_repeat 00:03:58.832 LINK reconnect 00:03:58.832 LINK abort 00:03:58.832 LINK arbitration 00:03:58.832 LINK nvme_manage 00:03:58.832 LINK scheduler 00:03:59.090 CC test/nvme/compliance/nvme_compliance.o 00:03:59.090 CC test/nvme/sgl/sgl.o 00:03:59.090 CC test/nvme/simple_copy/simple_copy.o 00:03:59.090 CC test/nvme/reserve/reserve.o 00:03:59.090 CC test/nvme/err_injection/err_injection.o 00:03:59.090 CC test/nvme/reset/reset.o 00:03:59.090 CC test/nvme/boot_partition/boot_partition.o 00:03:59.090 CC test/nvme/fdp/fdp.o 00:03:59.090 CC test/nvme/overhead/overhead.o 00:03:59.090 CC test/nvme/fused_ordering/fused_ordering.o 00:03:59.090 CC test/nvme/startup/startup.o 00:03:59.090 CC test/nvme/connect_stress/connect_stress.o 00:03:59.090 CC test/nvme/e2edp/nvme_dp.o 00:03:59.090 CC test/nvme/cuse/cuse.o 00:03:59.090 CC test/blobfs/mkfs/mkfs.o 00:03:59.090 CC test/nvme/aer/aer.o 00:03:59.090 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:59.090 CC test/accel/dif/dif.o 00:03:59.090 CC test/lvol/esnap/esnap.o 00:03:59.090 LINK boot_partition 00:03:59.090 LINK reserve 00:03:59.090 LINK err_injection 00:03:59.090 LINK startup 00:03:59.090 LINK connect_stress 00:03:59.090 LINK fused_ordering 00:03:59.090 LINK simple_copy 00:03:59.090 LINK doorbell_aers 00:03:59.090 LINK sgl 00:03:59.090 LINK reset 00:03:59.090 LINK mkfs 00:03:59.347 LINK nvme_dp 00:03:59.347 LINK aer 00:03:59.347 LINK fdp 00:03:59.347 LINK overhead 00:03:59.347 LINK nvme_compliance 00:03:59.605 LINK dif 00:03:59.605 CC examples/accel/perf/accel_perf.o 00:03:59.605 CC examples/blob/cli/blobcli.o 00:03:59.605 CC examples/blob/hello_world/hello_blob.o 00:03:59.605 CC examples/fsdev/hello_world/hello_fsdev.o 00:03:59.862 LINK hello_blob 00:03:59.862 LINK hello_fsdev 00:03:59.862 LINK cuse 00:03:59.862 LINK accel_perf 00:03:59.862 LINK blobcli 00:04:00.798 CC examples/bdev/bdevperf/bdevperf.o 00:04:00.798 CC examples/bdev/hello_world/hello_bdev.o 00:04:00.798 LINK hello_bdev 00:04:01.056 CC test/bdev/bdevio/bdevio.o 00:04:01.056 LINK bdevperf 00:04:01.315 LINK bdevio 00:04:02.694 LINK esnap 00:04:02.694 CC examples/nvmf/nvmf/nvmf.o 00:04:02.694 LINK nvmf 00:04:04.071 00:04:04.071 real 0m35.567s 00:04:04.071 user 4m36.460s 00:04:04.071 sys 1m40.125s 00:04:04.071 01:19:49 make -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:04:04.071 01:19:49 make -- common/autotest_common.sh@10 -- $ set +x 00:04:04.071 ************************************ 00:04:04.071 END TEST make 00:04:04.071 ************************************ 00:04:04.071 01:19:49 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:04.071 01:19:49 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:04.071 01:19:49 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:04.071 01:19:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:04.071 01:19:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 
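The banners traced above ("START TEST make" / "END TEST make" with the real/user/sys summary) come from autotest's run_test helper wrapping `make -j112`. As a rough, minimal sketch only (assuming a simplified shape; the actual helper in common/autotest_common.sh also manages xtrace state and argument checks), the pattern is roughly:

    # Hypothetical, simplified stand-in for the run_test helper seen in the
    # trace above; illustrative only, not the real SPDK implementation.
    run_test() {
            local test_name=$1
            shift
            echo "************************************"
            echo "START TEST $test_name"
            echo "************************************"
            time "$@"                  # e.g. run_test make make -j112
            local rc=$?
            echo "************************************"
            echo "END TEST $test_name"
            echo "************************************"
            return "$rc"
    }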
00:04:04.071 01:19:49 -- pm/common@44 -- $ pid=683469 00:04:04.071 01:19:49 -- pm/common@50 -- $ kill -TERM 683469 00:04:04.071 01:19:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:04.071 01:19:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:04.071 01:19:49 -- pm/common@44 -- $ pid=683471 00:04:04.071 01:19:49 -- pm/common@50 -- $ kill -TERM 683471 00:04:04.071 01:19:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:04.071 01:19:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:04.071 01:19:49 -- pm/common@44 -- $ pid=683473 00:04:04.071 01:19:49 -- pm/common@50 -- $ kill -TERM 683473 00:04:04.071 01:19:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:04.071 01:19:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:04.071 01:19:49 -- pm/common@44 -- $ pid=683500 00:04:04.071 01:19:49 -- pm/common@50 -- $ sudo -E kill -TERM 683500 00:04:04.330 01:19:50 -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:04.330 01:19:50 -- common/autotest_common.sh@1681 -- # lcov --version 00:04:04.330 01:19:50 -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:04.330 01:19:50 -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:04.330 01:19:50 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:04.331 01:19:50 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:04.331 01:19:50 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:04.331 01:19:50 -- scripts/common.sh@336 -- # IFS=.-: 00:04:04.331 01:19:50 -- scripts/common.sh@336 -- # read -ra ver1 00:04:04.331 01:19:50 -- scripts/common.sh@337 -- # IFS=.-: 00:04:04.331 01:19:50 -- scripts/common.sh@337 -- # read -ra ver2 00:04:04.331 01:19:50 -- scripts/common.sh@338 -- # local 'op=<' 00:04:04.331 01:19:50 -- scripts/common.sh@340 -- # ver1_l=2 00:04:04.331 01:19:50 -- scripts/common.sh@341 -- # ver2_l=1 00:04:04.331 01:19:50 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:04.331 01:19:50 -- scripts/common.sh@344 -- # case "$op" in 00:04:04.331 01:19:50 -- scripts/common.sh@345 -- # : 1 00:04:04.331 01:19:50 -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:04.331 01:19:50 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:04.331 01:19:50 -- scripts/common.sh@365 -- # decimal 1 00:04:04.331 01:19:50 -- scripts/common.sh@353 -- # local d=1 00:04:04.331 01:19:50 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:04.331 01:19:50 -- scripts/common.sh@355 -- # echo 1 00:04:04.331 01:19:50 -- scripts/common.sh@365 -- # ver1[v]=1 00:04:04.331 01:19:50 -- scripts/common.sh@366 -- # decimal 2 00:04:04.331 01:19:50 -- scripts/common.sh@353 -- # local d=2 00:04:04.331 01:19:50 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:04.331 01:19:50 -- scripts/common.sh@355 -- # echo 2 00:04:04.331 01:19:50 -- scripts/common.sh@366 -- # ver2[v]=2 00:04:04.331 01:19:50 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:04.331 01:19:50 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:04.331 01:19:50 -- scripts/common.sh@368 -- # return 0 00:04:04.331 01:19:50 -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:04.331 01:19:50 -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:04.331 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:04.331 --rc genhtml_branch_coverage=1 00:04:04.331 --rc genhtml_function_coverage=1 00:04:04.331 --rc genhtml_legend=1 00:04:04.331 --rc geninfo_all_blocks=1 00:04:04.331 --rc geninfo_unexecuted_blocks=1 00:04:04.331 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:04.331 ' 00:04:04.331 01:19:50 -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:04.331 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:04.331 --rc genhtml_branch_coverage=1 00:04:04.331 --rc genhtml_function_coverage=1 00:04:04.331 --rc genhtml_legend=1 00:04:04.331 --rc geninfo_all_blocks=1 00:04:04.331 --rc geninfo_unexecuted_blocks=1 00:04:04.331 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:04.331 ' 00:04:04.331 01:19:50 -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:04.331 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:04.331 --rc genhtml_branch_coverage=1 00:04:04.331 --rc genhtml_function_coverage=1 00:04:04.331 --rc genhtml_legend=1 00:04:04.331 --rc geninfo_all_blocks=1 00:04:04.331 --rc geninfo_unexecuted_blocks=1 00:04:04.331 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:04.331 ' 00:04:04.331 01:19:50 -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:04.331 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:04.331 --rc genhtml_branch_coverage=1 00:04:04.331 --rc genhtml_function_coverage=1 00:04:04.331 --rc genhtml_legend=1 00:04:04.331 --rc geninfo_all_blocks=1 00:04:04.331 --rc geninfo_unexecuted_blocks=1 00:04:04.331 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:04.331 ' 00:04:04.331 01:19:50 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:04.331 01:19:50 -- nvmf/common.sh@7 -- # uname -s 00:04:04.331 01:19:50 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:04.331 01:19:50 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:04.331 01:19:50 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:04.331 01:19:50 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:04.331 01:19:50 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:04.331 01:19:50 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:04.331 01:19:50 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:04.331 01:19:50 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:04.331 01:19:50 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:04.331 01:19:50 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:04.331 01:19:50 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:04.331 01:19:50 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:04.331 01:19:50 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:04.331 01:19:50 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:04.331 01:19:50 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:04.331 01:19:50 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:04.331 01:19:50 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:04.331 01:19:50 -- scripts/common.sh@15 -- # shopt -s extglob 00:04:04.331 01:19:50 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:04.331 01:19:50 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:04.331 01:19:50 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:04.331 01:19:50 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:04.331 01:19:50 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:04.331 01:19:50 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:04.331 01:19:50 -- paths/export.sh@5 -- # export PATH 00:04:04.331 01:19:50 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:04.331 01:19:50 -- nvmf/common.sh@51 -- # : 0 00:04:04.331 01:19:50 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:04:04.331 01:19:50 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:04:04.331 01:19:50 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:04.331 01:19:50 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:04.331 01:19:50 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:04.331 01:19:50 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:04:04.331 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:04:04.331 01:19:50 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:04:04.331 01:19:50 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:04:04.331 01:19:50 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:04:04.331 01:19:50 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:04.331 01:19:50 -- spdk/autotest.sh@32 -- # uname -s 00:04:04.331 
01:19:50 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:04.331 01:19:50 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:04.331 01:19:50 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:04.331 01:19:50 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:04.331 01:19:50 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:04.331 01:19:50 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:04.331 01:19:50 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:04.331 01:19:50 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:04.331 01:19:50 -- spdk/autotest.sh@48 -- # udevadm_pid=761553 00:04:04.331 01:19:50 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:04.331 01:19:50 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:04.331 01:19:50 -- pm/common@17 -- # local monitor 00:04:04.331 01:19:50 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:04.331 01:19:50 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:04.331 01:19:50 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:04.331 01:19:50 -- pm/common@21 -- # date +%s 00:04:04.331 01:19:50 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:04.331 01:19:50 -- pm/common@21 -- # date +%s 00:04:04.331 01:19:50 -- pm/common@25 -- # sleep 1 00:04:04.331 01:19:50 -- pm/common@21 -- # date +%s 00:04:04.331 01:19:50 -- pm/common@21 -- # date +%s 00:04:04.331 01:19:50 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1734394790 00:04:04.331 01:19:50 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1734394790 00:04:04.331 01:19:50 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1734394790 00:04:04.331 01:19:50 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1734394790 00:04:04.331 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1734394790_collect-cpu-temp.pm.log 00:04:04.331 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1734394790_collect-vmstat.pm.log 00:04:04.331 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1734394790_collect-cpu-load.pm.log 00:04:04.331 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1734394790_collect-bmc-pm.bmc.pm.log 00:04:05.267 01:19:51 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:05.267 01:19:51 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:05.267 01:19:51 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:05.267 01:19:51 -- common/autotest_common.sh@10 -- # set +x 
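The pm/common trace above shows autotest starting its background resource monitors: each collector under scripts/perf/pm/ is launched with an output directory (-d) and a log-name prefix derived from `date +%s` (-p), and is later torn down by reading the per-monitor .pid file and sending SIGTERM (the signal_monitor_resources TERM sequence traced earlier, just after the END TEST make banner). A minimal sketch of that start/stop pattern follows; the function names, variables, and the explicit backgrounding here are hypothetical simplifications, not the real pm/common helpers:

    # Illustrative sketch of the monitor start/stop pattern visible in the
    # pm/common trace above; names, variables, and '&' backgrounding are
    # assumptions for readability.
    SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    OUT_DIR=$SPDK_DIR/../output/power
    MONITORS=(collect-cpu-load collect-vmstat collect-cpu-temp collect-bmc-pm)

    start_monitors() {
            local stamp monitor
            stamp=$(date +%s)
            for monitor in "${MONITORS[@]}"; do
                    # Flags mirror the trace (-d output dir, -l, -p log-name prefix);
                    # each collector is assumed to record its pid in <monitor>.pid.
                    "$SPDK_DIR/scripts/perf/pm/$monitor" -d "$OUT_DIR" -l \
                            -p "monitor.autotest.sh.$stamp" &
            done
    }

    stop_monitors() {
            local monitor pid
            for monitor in "${MONITORS[@]}"; do
                    [[ -e $OUT_DIR/$monitor.pid ]] || continue
                    pid=$(<"$OUT_DIR/$monitor.pid")
                    kill -TERM "$pid"
            done
    }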
00:04:05.267 01:19:51 -- spdk/autotest.sh@59 -- # create_test_list 00:04:05.267 01:19:51 -- common/autotest_common.sh@748 -- # xtrace_disable 00:04:05.267 01:19:51 -- common/autotest_common.sh@10 -- # set +x 00:04:05.526 01:19:51 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:04:05.526 01:19:51 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:05.526 01:19:51 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:05.526 01:19:51 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:04:05.526 01:19:51 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:05.526 01:19:51 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:05.526 01:19:51 -- common/autotest_common.sh@1455 -- # uname 00:04:05.526 01:19:51 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:04:05.526 01:19:51 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:05.526 01:19:51 -- common/autotest_common.sh@1475 -- # uname 00:04:05.526 01:19:51 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:04:05.526 01:19:51 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:04:05.526 01:19:51 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:04:05.526 lcov: LCOV version 1.15 00:04:05.527 01:19:51 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:04:10.798 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:04:16.061 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:21.329 01:20:07 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:21.329 01:20:07 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:21.329 01:20:07 -- common/autotest_common.sh@10 -- # set +x 00:04:21.329 01:20:07 -- spdk/autotest.sh@78 -- # rm -f 00:04:21.329 01:20:07 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:24.613 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:24.613 
0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:24.613 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:24.613 01:20:10 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:24.613 01:20:10 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:04:24.613 01:20:10 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:04:24.614 01:20:10 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:04:24.614 01:20:10 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:04:24.614 01:20:10 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:04:24.614 01:20:10 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:04:24.614 01:20:10 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:24.614 01:20:10 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:04:24.614 01:20:10 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:24.614 01:20:10 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:24.614 01:20:10 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:24.614 01:20:10 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:24.614 01:20:10 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:24.614 01:20:10 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:24.872 No valid GPT data, bailing 00:04:24.872 01:20:10 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:24.872 01:20:10 -- scripts/common.sh@394 -- # pt= 00:04:24.872 01:20:10 -- scripts/common.sh@395 -- # return 1 00:04:24.872 01:20:10 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:24.872 1+0 records in 00:04:24.872 1+0 records out 00:04:24.872 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00176329 s, 595 MB/s 00:04:24.872 01:20:10 -- spdk/autotest.sh@105 -- # sync 00:04:24.872 01:20:10 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:24.872 01:20:10 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:24.872 01:20:10 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:32.978 01:20:17 -- spdk/autotest.sh@111 -- # uname -s 00:04:32.978 01:20:17 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:32.978 01:20:17 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:32.978 01:20:17 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:32.978 01:20:17 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:32.978 01:20:17 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:32.978 01:20:17 -- common/autotest_common.sh@10 -- # set +x 00:04:32.978 ************************************ 00:04:32.978 START TEST setup.sh 00:04:32.978 ************************************ 00:04:32.978 01:20:17 setup.sh -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:32.978 * Looking for test storage... 
00:04:32.978 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:32.978 01:20:18 setup.sh -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:32.978 01:20:18 setup.sh -- common/autotest_common.sh@1681 -- # lcov --version 00:04:32.978 01:20:18 setup.sh -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:32.978 01:20:18 setup.sh -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:32.978 01:20:18 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:32.979 01:20:18 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:32.979 01:20:18 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:32.979 01:20:18 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:32.979 01:20:18 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:32.979 01:20:18 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:32.979 01:20:18 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:32.979 01:20:18 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:32.979 01:20:18 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:32.979 01:20:18 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:32.979 01:20:18 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:32.979 01:20:18 setup.sh -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:32.979 01:20:18 setup.sh -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:32.979 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:32.979 --rc genhtml_branch_coverage=1 00:04:32.979 --rc genhtml_function_coverage=1 00:04:32.979 --rc genhtml_legend=1 00:04:32.979 --rc geninfo_all_blocks=1 00:04:32.979 --rc geninfo_unexecuted_blocks=1 00:04:32.979 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:32.979 ' 00:04:32.979 01:20:18 setup.sh -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:32.979 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:32.979 --rc genhtml_branch_coverage=1 00:04:32.979 --rc genhtml_function_coverage=1 00:04:32.979 --rc genhtml_legend=1 00:04:32.979 --rc geninfo_all_blocks=1 00:04:32.979 --rc geninfo_unexecuted_blocks=1 
00:04:32.979 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:32.979 ' 00:04:32.979 01:20:18 setup.sh -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:32.979 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:32.979 --rc genhtml_branch_coverage=1 00:04:32.979 --rc genhtml_function_coverage=1 00:04:32.979 --rc genhtml_legend=1 00:04:32.979 --rc geninfo_all_blocks=1 00:04:32.979 --rc geninfo_unexecuted_blocks=1 00:04:32.979 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:32.979 ' 00:04:32.979 01:20:18 setup.sh -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:32.979 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:32.979 --rc genhtml_branch_coverage=1 00:04:32.979 --rc genhtml_function_coverage=1 00:04:32.979 --rc genhtml_legend=1 00:04:32.979 --rc geninfo_all_blocks=1 00:04:32.979 --rc geninfo_unexecuted_blocks=1 00:04:32.979 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:32.979 ' 00:04:32.979 01:20:18 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:32.979 01:20:18 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:32.979 01:20:18 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:32.979 01:20:18 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:32.979 01:20:18 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:32.979 01:20:18 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:32.979 ************************************ 00:04:32.979 START TEST acl 00:04:32.979 ************************************ 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:32.979 * Looking for test storage... 
00:04:32.979 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1681 -- # lcov --version 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:32.979 01:20:18 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:32.979 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:32.979 --rc genhtml_branch_coverage=1 00:04:32.979 --rc genhtml_function_coverage=1 00:04:32.979 --rc genhtml_legend=1 00:04:32.979 --rc geninfo_all_blocks=1 00:04:32.979 --rc geninfo_unexecuted_blocks=1 00:04:32.979 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:32.979 ' 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:32.979 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:32.979 --rc genhtml_branch_coverage=1 00:04:32.979 --rc 
genhtml_function_coverage=1 00:04:32.979 --rc genhtml_legend=1 00:04:32.979 --rc geninfo_all_blocks=1 00:04:32.979 --rc geninfo_unexecuted_blocks=1 00:04:32.979 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:32.979 ' 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:32.979 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:32.979 --rc genhtml_branch_coverage=1 00:04:32.979 --rc genhtml_function_coverage=1 00:04:32.979 --rc genhtml_legend=1 00:04:32.979 --rc geninfo_all_blocks=1 00:04:32.979 --rc geninfo_unexecuted_blocks=1 00:04:32.979 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:32.979 ' 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:32.979 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:32.979 --rc genhtml_branch_coverage=1 00:04:32.979 --rc genhtml_function_coverage=1 00:04:32.979 --rc genhtml_legend=1 00:04:32.979 --rc geninfo_all_blocks=1 00:04:32.979 --rc geninfo_unexecuted_blocks=1 00:04:32.979 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:32.979 ' 00:04:32.979 01:20:18 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1656 -- # local nvme bdf 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:32.979 01:20:18 setup.sh.acl -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:04:32.979 01:20:18 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:32.979 01:20:18 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:32.979 01:20:18 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:32.979 01:20:18 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:32.979 01:20:18 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:32.979 01:20:18 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:32.979 01:20:18 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:36.260 01:20:21 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:36.260 01:20:21 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:36.260 01:20:21 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:36.260 01:20:21 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:36.260 01:20:21 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:36.260 01:20:21 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:39.543 Hugepages 00:04:39.543 node hugesize free / total 00:04:39.543 01:20:24 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:39.543 01:20:24 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:39.543 01:20:24 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 00:04:39.543 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme 
]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.543 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:39.544 01:20:25 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:39.544 01:20:25 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:39.544 01:20:25 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:39.544 01:20:25 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:39.544 ************************************ 00:04:39.544 START TEST denied 00:04:39.544 ************************************ 00:04:39.544 01:20:25 setup.sh.acl.denied -- 
common/autotest_common.sh@1125 -- # denied 00:04:39.544 01:20:25 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:39.544 01:20:25 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:39.544 01:20:25 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:39.544 01:20:25 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:39.544 01:20:25 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:42.824 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:04:42.824 01:20:28 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:42.824 01:20:28 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:42.824 01:20:28 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:42.824 01:20:28 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:42.824 01:20:28 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:42.824 01:20:28 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:42.824 01:20:28 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:42.824 01:20:28 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:42.824 01:20:28 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:42.824 01:20:28 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:47.013 00:04:47.013 real 0m7.546s 00:04:47.013 user 0m2.288s 00:04:47.013 sys 0m4.570s 00:04:47.013 01:20:32 setup.sh.acl.denied -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:47.013 01:20:32 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:47.013 ************************************ 00:04:47.013 END TEST denied 00:04:47.013 ************************************ 00:04:47.013 01:20:32 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:47.013 01:20:32 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:47.013 01:20:32 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:47.013 01:20:32 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:47.013 ************************************ 00:04:47.013 START TEST allowed 00:04:47.013 ************************************ 00:04:47.013 01:20:32 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # allowed 00:04:47.013 01:20:32 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:04:47.013 01:20:32 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:47.013 01:20:32 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:04:47.013 01:20:32 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:47.013 01:20:32 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:52.278 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:52.278 01:20:37 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:52.278 01:20:37 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:52.278 01:20:37 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:52.278 01:20:37 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:52.278 01:20:37 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:55.566 00:04:55.566 real 0m8.389s 00:04:55.566 user 0m2.261s 00:04:55.566 sys 0m4.540s 00:04:55.566 01:20:41 setup.sh.acl.allowed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:55.566 01:20:41 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:55.566 ************************************ 00:04:55.566 END TEST allowed 00:04:55.566 ************************************ 00:04:55.566 00:04:55.566 real 0m23.205s 00:04:55.566 user 0m7.143s 00:04:55.566 sys 0m14.028s 00:04:55.566 01:20:41 setup.sh.acl -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:55.566 01:20:41 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:55.566 ************************************ 00:04:55.566 END TEST acl 00:04:55.566 ************************************ 00:04:55.566 01:20:41 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:55.566 01:20:41 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:55.566 01:20:41 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:55.566 01:20:41 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:55.566 ************************************ 00:04:55.566 START TEST hugepages 00:04:55.566 ************************************ 00:04:55.566 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:55.566 * Looking for test storage... 00:04:55.566 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:55.566 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:55.566 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # lcov --version 00:04:55.566 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:55.827 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:55.827 01:20:41 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:04:55.827 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:55.827 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:55.827 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:55.827 --rc genhtml_branch_coverage=1 00:04:55.827 --rc genhtml_function_coverage=1 00:04:55.827 --rc genhtml_legend=1 00:04:55.827 --rc geninfo_all_blocks=1 00:04:55.827 --rc geninfo_unexecuted_blocks=1 00:04:55.827 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:55.827 ' 00:04:55.827 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:55.827 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:55.827 --rc genhtml_branch_coverage=1 00:04:55.827 --rc genhtml_function_coverage=1 00:04:55.827 --rc genhtml_legend=1 00:04:55.827 --rc geninfo_all_blocks=1 00:04:55.827 --rc geninfo_unexecuted_blocks=1 00:04:55.827 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:55.827 ' 00:04:55.827 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:55.827 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:55.827 --rc genhtml_branch_coverage=1 00:04:55.827 --rc genhtml_function_coverage=1 00:04:55.827 --rc genhtml_legend=1 00:04:55.827 --rc geninfo_all_blocks=1 00:04:55.827 --rc geninfo_unexecuted_blocks=1 00:04:55.827 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:55.827 ' 00:04:55.827 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:55.827 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:55.827 --rc genhtml_branch_coverage=1 00:04:55.827 --rc genhtml_function_coverage=1 00:04:55.827 --rc genhtml_legend=1 00:04:55.827 --rc geninfo_all_blocks=1 00:04:55.827 --rc geninfo_unexecuted_blocks=1 00:04:55.827 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:55.827 ' 00:04:55.827 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:55.827 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:55.827 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:55.827 01:20:41 
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:55.827 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:55.827 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 39274552 kB' 'MemAvailable: 42999748 kB' 'Buffers: 9316 kB' 'Cached: 12715424 kB' 'SwapCached: 0 kB' 'Active: 9541432 kB' 'Inactive: 3688880 kB' 'Active(anon): 9124948 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 508984 kB' 'Mapped: 157680 kB' 'Shmem: 8619376 kB' 'KReclaimable: 233848 kB' 'Slab: 912604 kB' 'SReclaimable: 233848 kB' 'SUnreclaim: 678756 kB' 'KernelStack: 21776 kB' 'PageTables: 7752 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 10315016 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214112 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r 
var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.828 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 
01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 
01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 
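(Editor's note, hedged: the trace above is setup/common.sh's get_meminfo walking /proc/meminfo key by key until it reaches Hugepagesize, then echoing 2048. The function below is only a reconstruction from that trace, not the SPDK source; the real helper buffers the file with mapfile and strips per-node prefixes, which is simplified here.)

get_meminfo_sketch() {                              # sketch reconstructed from the xtrace
    local get=$1 node=${2:-}                        # e.g. get=Hugepagesize, node unset in this run
    local mem_f=/proc/meminfo
    # the trace first probes /sys/devices/system/node/node$node/meminfo, then falls back
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local var val _
    while IFS=': ' read -r var val _; do            # splits "Hugepagesize:    2048 kB"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }   # -> 2048 here
    done < "$mem_f"
    return 1
}

With Hugepagesize reported as 2048 kB, hugepages.sh sets default_hugepages=2048, and the later get_test_nr_hugepages 2097152 0 call in single_node_setup derives nr_hugepages=1024 from the requested size (2097152 kB / 2048 kB per page = 1024 pages).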
00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:04:55.829 01:20:41 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup 00:04:55.829 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:55.829 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:55.829 01:20:41 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:55.829 ************************************ 00:04:55.829 START TEST single_node_setup 00:04:55.829 ************************************ 00:04:55.829 01:20:41 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1125 -- # single_node_setup 00:04:55.829 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0 00:04:55.829 01:20:41 setup.sh.hugepages.single_node_setup 
-- setup/hugepages.sh@48 -- # local size=2097152 00:04:55.829 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:04:55.829 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0') 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.830 01:20:41 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:59.117 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:59.117 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:59.117 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:59.117 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:59.117 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:59.117 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:59.374 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:59.374 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:59.374 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:59.374 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:59.374 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:59.374 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:59.374 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:59.374 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:59.374 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:59.374 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:00.751 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@137 -- # 
verify_nr_hugepages 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41448116 kB' 'MemAvailable: 45172752 kB' 'Buffers: 9316 kB' 'Cached: 12715552 kB' 'SwapCached: 0 kB' 'Active: 9544788 kB' 'Inactive: 3688880 kB' 'Active(anon): 9128304 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512308 kB' 'Mapped: 157844 kB' 'Shmem: 8619504 kB' 'KReclaimable: 232728 kB' 'Slab: 910112 kB' 'SReclaimable: 232728 kB' 'SUnreclaim: 677384 kB' 'KernelStack: 21920 kB' 'PageTables: 8144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10318828 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214192 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.054 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 
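Before the equally verbose HugePages_Surp and HugePages_Rsvd scans that follow, it may help to note where this run ends up: anon=0, surp=0, resv=0, nr_hugepages=1024 and HugePages_Total: 1024 (all visible further down), so the consistency checks traced at hugepages.sh@106 and @108 reduce to trivial arithmetic. A self-contained restatement of just those checks (variable names follow the trace; the surrounding script text is assumed, not quoted):

  # Values reported later in this run
  nr_hugepages=1024; surp=0; resv=0; anon=0
  (( 1024 == nr_hugepages + surp + resv ))   # hugepages.sh@106: 1024 == 1024 + 0 + 0 -> passes
  (( 1024 == nr_hugepages ))                 # hugepages.sh@108: 1024 == 1024 -> passes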
00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.055 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:01.056 01:20:46 
setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41448360 kB' 'MemAvailable: 45172996 kB' 'Buffers: 9316 kB' 'Cached: 12715556 kB' 'SwapCached: 0 kB' 'Active: 9543612 kB' 'Inactive: 3688880 kB' 'Active(anon): 9127128 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511044 kB' 'Mapped: 157728 kB' 'Shmem: 8619508 kB' 'KReclaimable: 232728 kB' 'Slab: 910052 kB' 'SReclaimable: 232728 kB' 'SUnreclaim: 677324 kB' 'KernelStack: 21856 kB' 'PageTables: 7624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10317728 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214192 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
[[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
continue 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.056 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.057 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local 
get=HugePages_Rsvd 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41450472 kB' 'MemAvailable: 45175108 kB' 'Buffers: 9316 kB' 'Cached: 12715572 kB' 'SwapCached: 0 kB' 'Active: 9543192 kB' 'Inactive: 3688880 kB' 'Active(anon): 9126708 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511100 kB' 'Mapped: 157728 kB' 'Shmem: 8619524 kB' 'KReclaimable: 232728 kB' 'Slab: 910052 kB' 'SReclaimable: 232728 kB' 'SUnreclaim: 677324 kB' 'KernelStack: 21808 kB' 'PageTables: 7448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10319260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214192 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.058 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.059 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:01.060 nr_hugepages=1024 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:01.060 resv_hugepages=0 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:01.060 surplus_hugepages=0 00:05:01.060 01:20:46 
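(The trace up to this point is setup/common.sh's meminfo reader scanning /proc/meminfo one line at a time: with IFS=': ' it reads each line into var/val, compares var against the requested key — here HugePages_Rsvd, which appears in the trace as the escaped glob \H\u\g\e\P\a\g\e\s\_\R\s\v\d — and hits "continue" until the key matches, then echoes the value and returns; the caller then reports nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0. A minimal stand-alone sketch of that idea follows; the function name and structure are illustrative, not the actual SPDK source, which additionally captures the file with mapfile and strips the "Node N " prefix when a per-node meminfo file is used.)

  shopt -s extglob
  read_meminfo_field() {
      local key=$1 node=${2-}
      local file=/proc/meminfo line var val _
      # When a NUMA node is given, prefer that node's meminfo, as the trace does.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          file=/sys/devices/system/node/node$node/meminfo
      fi
      while IFS= read -r line; do
          line=${line#Node +([0-9]) }          # per-node files prefix every line with "Node N "
          IFS=': ' read -r var val _ <<< "$line"
          if [[ $var == "$key" ]]; then
              echo "$val"
              return 0
          fi
      done < "$file"
      return 1
  }

  # Example use matching this run: resv=$(read_meminfo_field HugePages_Rsvd)  -> 0
  #                                nr=$(read_meminfo_field HugePages_Total)   -> 1024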
setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:01.060 anon_hugepages=0 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41451256 kB' 'MemAvailable: 45175892 kB' 'Buffers: 9316 kB' 'Cached: 12715604 kB' 'SwapCached: 0 kB' 'Active: 9543832 kB' 'Inactive: 3688880 kB' 'Active(anon): 9127348 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511172 kB' 'Mapped: 157728 kB' 'Shmem: 8619556 kB' 'KReclaimable: 232728 kB' 'Slab: 910052 kB' 'SReclaimable: 232728 kB' 'SUnreclaim: 677324 kB' 'KernelStack: 21840 kB' 'PageTables: 7892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10319780 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214224 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.060 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.061 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- 
setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 18785972 kB' 'MemUsed: 13848464 kB' 'SwapCached: 0 kB' 'Active: 6722480 kB' 'Inactive: 3572048 kB' 'Active(anon): 6474880 kB' 'Inactive(anon): 0 kB' 'Active(file): 247600 kB' 'Inactive(file): 3572048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9853160 kB' 'Mapped: 79792 kB' 'AnonPages: 444568 kB' 'Shmem: 6033512 kB' 'KernelStack: 13080 kB' 'PageTables: 5212 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 109460 kB' 'Slab: 453048 kB' 'SReclaimable: 109460 kB' 'SUnreclaim: 343588 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.062 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:01.063 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:01.064 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:01.064 node0=1024 expecting 1024 00:05:01.064 01:20:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:01.064 00:05:01.064 real 0m5.276s 00:05:01.064 user 0m1.424s 00:05:01.064 sys 0m2.425s 00:05:01.064 01:20:46 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:01.064 01:20:46 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x 00:05:01.064 ************************************ 00:05:01.064 END TEST single_node_setup 00:05:01.064 ************************************ 00:05:01.064 01:20:47 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc 00:05:01.064 01:20:47 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:01.064 01:20:47 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:01.064 01:20:47 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:01.322 ************************************ 00:05:01.322 START TEST even_2G_alloc 00:05:01.322 ************************************ 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # even_2G_alloc 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local 
_nr_hugepages=1024 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:01.322 01:20:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:04.616 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:04.616 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:04.616 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:04.616 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:04.616 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:04.616 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:04.617 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:04.617 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:04.617 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:04.617 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:04.617 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:04.617 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:04.617 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:04.617 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:04.617 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:04.617 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:04.617 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:04.617 01:20:50 
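(By this point the even_2G_alloc test has turned the requested 2097152 kB into 1024 default-size 2048 kB hugepages and, with two NUMA nodes and no user-supplied node list, split them evenly at 512 per node before re-running scripts/setup.sh, whose output above shows the NVMe and IOAT devices already bound to vfio-pci. A rough sketch of that sizing arithmetic, and of reading the resulting per-node counts back from sysfs once setup.sh has applied NRHUGE, is shown below; the variable names and loop are illustrative rather than the exact hugepages.sh code.)

  size_kb=2097152                                                  # 2 GiB requested by the test
  default_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)    # 2048 kB on this host
  nr_hugepages=$(( size_kb / default_kb ))                         # -> 1024

  no_nodes=$(ls -d /sys/devices/system/node/node[0-9]* | wc -l)    # 2 on this box
  per_node=$(( nr_hugepages / no_nodes ))                          # -> 512 per node

  # Read back the actual per-node allocation after setup.sh has run:
  for n in /sys/devices/system/node/node[0-9]*; do
      echo "${n##*/}: $(cat "$n"/hugepages/hugepages-2048kB/nr_hugepages) pages"
  done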
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41437808 kB' 'MemAvailable: 45162444 kB' 'Buffers: 9316 kB' 'Cached: 12715712 kB' 'SwapCached: 0 kB' 'Active: 9543856 kB' 'Inactive: 3688880 kB' 'Active(anon): 9127372 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511132 kB' 'Mapped: 156744 kB' 'Shmem: 8619664 kB' 'KReclaimable: 232728 kB' 'Slab: 909508 kB' 'SReclaimable: 232728 kB' 'SUnreclaim: 676780 kB' 'KernelStack: 21920 kB' 'PageTables: 7768 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10309696 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214528 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 
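(The hugepages.sh@95 test above is a transparent-hugepage check: the left-hand string "always [madvise] never" is what this host reports in /sys/kernel/mm/transparent_hugepage/enabled, with madvise as the bracketed active mode, and the escaped pattern only matches when "[never]" is the active choice, in which case the anon-hugepage accounting would be skipped. A small stand-alone sketch of the same decision, assuming the standard sysfs path and not reproducing the exact verify_nr_hugepages code:)

  thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
  if [[ $thp != *"[never]"* ]]; then
      # THP is not fully disabled, so anonymous huge pages may exist; read the counter
      anon_kb=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
      echo "AnonHugePages: ${anon_kb} kB"
  else
      echo "THP disabled; skipping AnonHugePages"
  fi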
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.617 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 
01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.618 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41435604 kB' 'MemAvailable: 45160240 kB' 'Buffers: 9316 kB' 'Cached: 12715716 kB' 'SwapCached: 0 kB' 'Active: 9544088 kB' 
'Inactive: 3688880 kB' 'Active(anon): 9127604 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511284 kB' 'Mapped: 156744 kB' 'Shmem: 8619668 kB' 'KReclaimable: 232728 kB' 'Slab: 909508 kB' 'SReclaimable: 232728 kB' 'SUnreclaim: 676780 kB' 'KernelStack: 22000 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10308348 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.619 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.620 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41436604 kB' 'MemAvailable: 45161240 kB' 'Buffers: 9316 kB' 'Cached: 12715716 kB' 'SwapCached: 0 kB' 'Active: 9543632 kB' 'Inactive: 3688880 kB' 'Active(anon): 9127148 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510764 kB' 'Mapped: 156664 kB' 'Shmem: 8619668 kB' 'KReclaimable: 232728 kB' 'Slab: 909456 kB' 'SReclaimable: 232728 kB' 'SUnreclaim: 676728 kB' 'KernelStack: 21872 kB' 'PageTables: 7872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10309872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 
'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.621 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@33 -- # echo 0 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:04.622 nr_hugepages=1024 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:04.622 resv_hugepages=0 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:04.622 surplus_hugepages=0 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:04.622 anon_hugepages=0 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:04.622 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41436176 kB' 'MemAvailable: 45160812 kB' 'Buffers: 9316 kB' 'Cached: 12715756 kB' 'SwapCached: 0 kB' 'Active: 9543032 kB' 'Inactive: 3688880 kB' 'Active(anon): 9126548 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510036 kB' 'Mapped: 156716 kB' 'Shmem: 8619708 kB' 'KReclaimable: 232728 kB' 'Slab: 909452 kB' 'SReclaimable: 232728 kB' 'SUnreclaim: 676724 kB' 'KernelStack: 21792 kB' 'PageTables: 7452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10309896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 
478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.623 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:04.624 01:20:50 
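The xtrace above is the get_meminfo helper from setup/common.sh walking every key in /proc/meminfo until it reaches the one requested (HugePages_Total here, answering 1024), after which get_nodes seeds the expectation of 512 pages on each of the 2 NUMA nodes. A condensed, hedged reconstruction of that helper follows; the function name, the mem_f selection and the IFS=': ' read loop come from the trace, while the sed prefix-strip is a simplification of the mapfile "Node N " trim seen there.
get_meminfo() {
    local get=$1 node=$2 var val _
    local mem_f=/proc/meminfo
    # When a node index is supplied, read that node's meminfo instead; its lines
    # carry a leading "Node N " prefix that is stripped before matching.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
    done < <(sed 's/^Node [0-9]* *//' "$mem_f")
    echo 0
}
Called as get_meminfo HugePages_Total for the system-wide count, or get_meminfo HugePages_Surp 0 for node 0, matching the calls traced in this test.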
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:04.624 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 19825160 kB' 'MemUsed: 12809276 kB' 'SwapCached: 0 kB' 'Active: 6721880 kB' 'Inactive: 3572048 kB' 'Active(anon): 6474280 kB' 'Inactive(anon): 0 kB' 'Active(file): 247600 kB' 'Inactive(file): 3572048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9853232 kB' 'Mapped: 79488 kB' 'AnonPages: 443816 kB' 'Shmem: 6033584 kB' 'KernelStack: 13096 kB' 'PageTables: 5184 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 109452 kB' 'Slab: 452428 kB' 'SReclaimable: 109452 kB' 'SUnreclaim: 342976 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 
01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 
01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.625 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 21610140 kB' 'MemUsed: 6039220 kB' 'SwapCached: 0 kB' 'Active: 2821356 kB' 'Inactive: 116832 kB' 'Active(anon): 2652472 kB' 'Inactive(anon): 0 kB' 'Active(file): 168884 kB' 'Inactive(file): 116832 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2871876 kB' 'Mapped: 77228 kB' 'AnonPages: 66448 kB' 'Shmem: 2586160 kB' 'KernelStack: 8776 kB' 'PageTables: 2500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 123268 kB' 'Slab: 457016 kB' 'SReclaimable: 123268 kB' 'SUnreclaim: 333748 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.626 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 
01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- 
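Both per-node reads above returned HugePages_Surp: 0, so the expected count for each node stays at 512. The accounting loop being traced (setup/hugepages.sh@114-116) amounts to the sketch below, reusing the get_meminfo helper outlined earlier; the final echo line is only illustrative of the "node0=512 expecting 512" output that follows in the log.
nodes_test=([0]=512 [1]=512)   # expectation seeded by get_nodes from nodes_sys
resv=0                         # HugePages_Rsvd read earlier in the trace
for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv ))
    surp=$(get_meminfo HugePages_Surp "$node")   # per-node surplus, 0 on both nodes here
    (( nodes_test[node] += surp ))
done
echo "node0=${nodes_test[0]} expecting 512; node1=${nodes_test[1]} expecting 512"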
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512' 00:05:04.627 node0=512 expecting 512 00:05:04.627 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:04.628 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:04.628 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:04.628 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512' 00:05:04.628 node1=512 expecting 512 00:05:04.628 01:20:50 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]] 00:05:04.628 00:05:04.628 real 0m3.310s 00:05:04.628 user 0m1.171s 00:05:04.628 sys 0m2.169s 00:05:04.628 01:20:50 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:04.628 01:20:50 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:04.628 ************************************ 00:05:04.628 END TEST even_2G_alloc 00:05:04.628 ************************************ 00:05:04.628 01:20:50 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc 00:05:04.628 01:20:50 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:04.628 01:20:50 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:04.628 01:20:50 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:04.628 ************************************ 00:05:04.628 START TEST odd_alloc 00:05:04.628 ************************************ 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # odd_alloc 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:04.628 01:20:50 
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:04.628 01:20:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:07.915 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:07.915 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:07.915 01:20:53 
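The even_2G_alloc run above finishes with both NUMA nodes reporting 512 pages ("node0=512 expecting 512", "node1=512 expecting 512"), and the odd_alloc test that follows requests 2098176 kB, which the script turns into an odd count of 1025 pages (nr_hugepages=1025; the meminfo dump below shows Hugetlb: 2099200 kB = 1025 x 2048 kB). The nodes_test assignments in the trace give one node 512 pages and the other 513. A minimal illustrative sketch of that kind of split, assuming the two-node layout and page counts seen in this run (a stand-in for readability, not the actual setup/hugepages.sh get_test_nr_hugepages_per_node code):

#!/usr/bin/env bash
nr_hugepages=1025   # as set by the odd_alloc trace above
no_nodes=2          # two NUMA nodes in this test system
declare -a nodes_test
# give every node the floor of the even share ...
for (( node = 0; node < no_nodes; node++ )); do
    nodes_test[node]=$(( nr_hugepages / no_nodes ))
done
# ... then add the remainder to one node, matching the 512/513 values in the trace
nodes_test[0]=$(( nodes_test[0] + nr_hugepages % no_nodes ))
for node in "${!nodes_test[@]}"; do
    echo "node${node}=${nodes_test[node]}"
done
# prints: node0=513, node1=512 (total 1025)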
setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41455360 kB' 'MemAvailable: 45179992 kB' 'Buffers: 9316 kB' 'Cached: 12715892 kB' 'SwapCached: 0 kB' 'Active: 9542772 kB' 'Inactive: 3688880 kB' 'Active(anon): 9126288 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509668 kB' 'Mapped: 156764 kB' 'Shmem: 8619844 kB' 'KReclaimable: 232720 kB' 'Slab: 910856 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 678136 kB' 'KernelStack: 21728 kB' 'PageTables: 7444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10308132 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.915 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41454100 kB' 'MemAvailable: 45178732 kB' 'Buffers: 9316 kB' 'Cached: 12715896 kB' 'SwapCached: 0 kB' 'Active: 9542476 kB' 'Inactive: 3688880 kB' 'Active(anon): 9125992 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509408 kB' 'Mapped: 156700 kB' 'Shmem: 8619848 kB' 'KReclaimable: 232720 kB' 'Slab: 910856 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 678136 kB' 'KernelStack: 21712 kB' 'PageTables: 7420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10307776 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.916 
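The long runs of "[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]" / "continue" pairs in this trace are get_meminfo scanning the meminfo output one "key: value" line at a time until the requested field matches; the backslash-escaped names are just how bash xtrace renders the expanded right-hand side of the == comparison, not literal escapes in the script. The AnonHugePages lookup above ends with "echo 0" / "return 0" (anon=0), and the same scan repeats below for HugePages_Surp and HugePages_Rsvd. A simplified stand-in that follows the same pattern (an illustrative sketch, not the setup/common.sh implementation, which additionally strips the leading "Node <N> " prefix when reading a per-node meminfo file):

#!/usr/bin/env bash
get_meminfo_sketch() {
    # print the value of one /proc/meminfo (or per-node meminfo) field, or 0 if absent
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # xtrace shows this match as the escaped pattern above
        echo "$val"
        return 0
    done < "$mem_f"
    echo 0
}

get_meminfo_sketch HugePages_Surp   # -> surplus huge page count, 0 in this run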
01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.916 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.917 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 
01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- 
# [[ -n '' ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41454604 kB' 'MemAvailable: 45179236 kB' 'Buffers: 9316 kB' 'Cached: 12715896 kB' 'SwapCached: 0 kB' 'Active: 9542244 kB' 'Inactive: 3688880 kB' 'Active(anon): 9125760 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509172 kB' 'Mapped: 156700 kB' 'Shmem: 8619848 kB' 'KReclaimable: 232720 kB' 'Slab: 910856 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 678136 kB' 'KernelStack: 21712 kB' 'PageTables: 7392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10307932 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.918 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:05:07.918-00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # (repetitive xtrace condensed: each remaining /proc/meminfo key from SwapCached through HugePages_Free was read and compared against HugePages_Rsvd; none matched, so every iteration took 'continue')
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025
00:05:07.920 nr_hugepages=1025
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:07.920 resv_hugepages=0
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:07.920 surplus_hugepages=0
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:07.920 anon_hugepages=0
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages ))
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:07.920 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41455008 kB' 'MemAvailable: 45179640 kB' 'Buffers: 9316 kB' 'Cached: 12715932 kB' 'SwapCached: 0 kB' 'Active: 9542244 kB' 'Inactive: 3688880 kB' 'Active(anon): 9125760 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509140 kB' 'Mapped: 156700 kB' 'Shmem: 8619884 kB' 'KReclaimable: 232720 kB' 'Slab: 910856 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 678136 kB' 'KernelStack: 21696 kB' 'PageTables: 7340 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10307956 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB'
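The get_meminfo helper producing this trace can be sketched roughly as follows. This is a reconstruction from the xtrace output above (setup/common.sh@16-33), not the verbatim SPDK source; exact names and line layout may differ in test/setup/common.sh:

  shopt -s extglob  # needed for the +([0-9]) pattern below
  # Rough sketch: report one value from /proc/meminfo, or from a per-NUMA-node meminfo when a node id is given.
  get_meminfo() {
      local get=$1    # meminfo key to report, e.g. HugePages_Total
      local node=$2   # optional NUMA node id; empty means system-wide /proc/meminfo
      local var val
      local mem_f mem
      mem_f=/proc/meminfo
      # Use the per-node view when the sysfs file exists, e.g. /sys/devices/system/node/node0/meminfo
      [[ -e /sys/devices/system/node/node$node/meminfo ]] && mem_f=/sys/devices/system/node/node$node/meminfo
      mapfile -t mem < "$mem_f"
      # Per-node meminfo prefixes every line with "Node <id> "; strip it so keys look like /proc/meminfo
      mem=("${mem[@]#Node +([0-9]) }")
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] && echo "$val" && return 0   # print the requested value and stop
          continue                                          # every other key is skipped, as in the trace above
      done < <(printf '%s\n' "${mem[@]}")
      return 1
  }

Called as in this log, get_meminfo HugePages_Total prints 1025 on this host, and get_meminfo HugePages_Surp 0 reads node0's meminfo instead of the system-wide file.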
00:05:07.920-00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # (repetitive xtrace condensed: each /proc/meminfo key from MemTotal through Unaccepted was read and compared against HugePages_Total; none matched, so every iteration took 'continue')
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
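At this point the odd_alloc case has confirmed 1025 hugepages in total (an odd count that cannot split evenly) and has read the actual per-node counts from sysfs: 513 on node0 and 512 on node1. A minimal sketch of the per-node accounting implied by the setup/hugepages.sh@114-116 entries below, assuming nodes_test holds the expected per-node split and using the values reported on this box (this is illustrative, not the verbatim script):

  nodes_test=([0]=513 [1]=512)   # expected pages per NUMA node (values from this run)
  resv=0                         # HugePages_Rsvd reported by get_meminfo above

  for node in "${!nodes_test[@]}"; do
      ((nodes_test[node] += resv))                                  # reserved pages raise each node's expectation
      ((nodes_test[node] += $(get_meminfo HugePages_Surp "$node"))) # so do per-node surplus pages (0 here)
  done

Presumably the script then compares each node's actual HugePages_Total (513 and 512 in this run) against these adjusted expectations.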
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:07.922 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 19831884 kB' 'MemUsed: 12802552 kB' 'SwapCached: 0 kB' 'Active: 6720676 kB' 'Inactive: 3572048 kB' 'Active(anon): 6473076 kB' 'Inactive(anon): 0 kB' 'Active(file): 247600 kB' 'Inactive(file): 3572048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9853300 kB' 'Mapped: 79472 kB' 'AnonPages: 442584 kB' 'Shmem: 6033652 kB' 'KernelStack: 12936 kB' 'PageTables: 4876 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 109452 kB' 'Slab: 453788 kB' 'SReclaimable: 109452 kB' 'SUnreclaim: 344336 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:05:07.922-00:05:07.923 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # (repetitive xtrace condensed: each node0 meminfo key from MemTotal through HugePages_Free was compared against HugePages_Surp; none matched, so every iteration took 'continue')
00:05:07.923 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:07.923 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:07.923 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:07.923 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
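The per-node counters the helper is parsing here can also be inspected directly on the build host; for example (illustrative command, not part of the test scripts):

  # Show hugepage counters for every NUMA node; on this run node0 reports 513 pages and node1 reports 512
  grep -H HugePages /sys/devices/system/node/node*/meminfo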
00:05:07.923 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:07.923 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:07.923 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:07.923 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:07.923 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1
00:05:07.924 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:07.924 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:07.924 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:07.924 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:07.924 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:07.924 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:07.924 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:07.924 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 21623124 kB' 'MemUsed: 6026236 kB' 'SwapCached: 0 kB' 'Active: 2821576 kB' 'Inactive: 116832 kB' 'Active(anon): 2652692 kB' 'Inactive(anon): 0 kB' 'Active(file): 168884 kB' 'Inactive(file): 116832 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2871988 kB' 'Mapped: 77228 kB' 'AnonPages: 66480 kB' 'Shmem: 2586272 kB' 'KernelStack: 8760 kB' 'PageTables: 2464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 123268 kB' 'Slab: 457068 kB' 'SReclaimable: 123268 kB' 'SUnreclaim: 333800 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:07.924 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:07.924 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:07.924-00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # (repetitive xtrace condensed: node1 meminfo keys from MemTotal through FileHugePages compared against HugePages_Surp, each taking 'continue')
00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.925 01:20:53
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513' 00:05:07.925 node0=513 expecting 513 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512' 00:05:07.925 node1=512 expecting 512 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:07.925 00:05:07.925 real 0m3.301s 00:05:07.925 user 0m1.211s 00:05:07.925 sys 0m2.047s 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:07.925 01:20:53 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:07.925 ************************************ 00:05:07.925 END TEST odd_alloc 00:05:07.925 ************************************ 00:05:07.925 01:20:53 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test custom_alloc custom_alloc 00:05:07.925 01:20:53 setup.sh.hugepages -- common/autotest_common.sh@1101 -- 
# '[' 2 -le 1 ']' 00:05:07.925 01:20:53 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:07.925 01:20:53 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:07.925 ************************************ 00:05:07.925 START TEST custom_alloc 00:05:07.925 ************************************ 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # custom_alloc 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=, 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=() 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # local nodes_hp 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=512 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 256 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 1 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 0 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@166 -- # (( 2 > 1 )) 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:07.925 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 1 > 0 )) 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}" 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}" 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 2 > 0 )) 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=1024 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:07.926 01:20:53 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:11.267 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:11.267 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88 -- # local node 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:11.267 
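[editor's note] The custom_alloc prologue traced just above builds its request from two sizes divided by the 2048 kB hugepage size: 1048576 kB -> 512 pages assigned to nodes_hp[0], 2097152 kB -> 1024 pages assigned to nodes_hp[1], giving HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' and a 1536-page total. A small sketch of that arithmetic (variable and helper names here are illustrative, not the actual hugepages.sh functions):

  # Sketch: convert per-pool sizes (kB) into hugepage counts and a HUGENODE string.
  hugepagesize_kb=2048                 # matches 'Hugepagesize: 2048 kB' in this trace
  sizes_kb=(1048576 2097152)           # 1 GiB and 2 GiB, as requested by custom_alloc
  nodes_hp=() total=0 hugenode=""
  for node in "${!sizes_kb[@]}"; do
      nodes_hp[node]=$((sizes_kb[node] / hugepagesize_kb))
      total=$((total + nodes_hp[node]))
      hugenode+="${hugenode:+,}nodes_hp[$node]=${nodes_hp[node]}"
  done
  echo "HUGENODE='$hugenode' (total $total pages)"
  # -> HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' (total 1536 pages)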
01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40459164 kB' 'MemAvailable: 44183796 kB' 'Buffers: 9316 kB' 'Cached: 12716068 kB' 'SwapCached: 0 kB' 'Active: 9544112 kB' 'Inactive: 3688880 kB' 'Active(anon): 9127628 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510868 kB' 'Mapped: 156760 kB' 'Shmem: 8620020 kB' 'KReclaimable: 232720 kB' 'Slab: 910388 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 677668 kB' 'KernelStack: 21600 kB' 'PageTables: 7392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10309084 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.267 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.267 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
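[editor's note] The [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] check traced a few entries back (setup/hugepages.sh@95) reads the transparent-hugepage mode: the kernel reports the active setting in brackets, so the harness only samples an AnonHugePages baseline (anon=0 a little further down) when THP is not disabled. A hedged way to inspect the same gate by hand:

  # Sketch: read the active THP mode the same way the test gates on it.
  thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
  if [[ $thp != *"[never]"* ]]; then
      # THP enabled (always or madvise): anonymous hugepages can show up in AnonHugePages.
      grep AnonHugePages /proc/meminfo
  else
      echo "THP disabled; AnonHugePages stays at 0 kB"
  fi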
00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:11.268 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40458924 kB' 'MemAvailable: 44183556 kB' 'Buffers: 9316 kB' 'Cached: 12716076 kB' 'SwapCached: 0 kB' 'Active: 9544104 kB' 'Inactive: 3688880 kB' 'Active(anon): 9127620 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510896 kB' 'Mapped: 156728 kB' 'Shmem: 8620028 kB' 'KReclaimable: 232720 kB' 'Slab: 910420 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 677700 kB' 'KernelStack: 21680 kB' 'PageTables: 7388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10309100 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 
'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
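[editor's note] The long run of IFS=': ' / read -r var val _ / continue entries here (and in the odd_alloc pass earlier) is setup/common.sh's get_meminfo walking the meminfo dump one field at a time until it reaches the requested key, HugePages_Surp in this case, then echoing the value and returning. A condensed sketch of that pattern, not the literal common.sh source (which reads from a mapfile'd copy of the dump):

  # Sketch: return one field from /proc/meminfo, mirroring the traced scan loop.
  get_meminfo_sketch() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] || continue   # every skipped field appears as 'continue' above
          echo "$val"
          return 0
      done < /proc/meminfo
      return 1
  }
  get_meminfo_sketch HugePages_Surp   # -> 0 in this run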
00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.269 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
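[editor's note] The surp/resv bookkeeping in verify_nr_hugepages boils down to the HugePages_* counters visible in the meminfo dumps above (Total 1536, Free 1536, Rsvd 0, Surp 0). Outside the harness, a quick way to eyeball the same counters, offered as a convenience rather than as part of the test:

  # Quick manual check of the global hugepage counters this test verifies.
  grep -E 'HugePages_(Total|Free|Rsvd|Surp)|Hugepagesize' /proc/meminfo
  # Expected while this custom_alloc run holds its pages:
  #   HugePages_Total:    1536
  #   HugePages_Free:     1536
  #   HugePages_Rsvd:        0
  #   HugePages_Surp:        0
  #   Hugepagesize:       2048 kB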
00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.270 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.270 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40458924 kB' 'MemAvailable: 44183556 kB' 'Buffers: 9316 kB' 'Cached: 12716076 kB' 'SwapCached: 0 kB' 'Active: 9544104 kB' 'Inactive: 3688880 kB' 'Active(anon): 9127620 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510896 kB' 'Mapped: 156728 kB' 'Shmem: 8620028 kB' 'KReclaimable: 232720 kB' 'Slab: 910420 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 677700 kB' 'KernelStack: 21680 kB' 'PageTables: 7388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10309124 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.271 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 
01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536 00:05:11.272 nr_hugepages=1536 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:11.272 resv_hugepages=0 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:11.272 surplus_hugepages=0 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:11.272 anon_hugepages=0 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages )) 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:11.272 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:11.272 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40460372 kB' 'MemAvailable: 44185004 kB' 'Buffers: 9316 kB' 'Cached: 12716080 kB' 'SwapCached: 0 kB' 'Active: 9543912 kB' 'Inactive: 3688880 kB' 'Active(anon): 9127428 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510668 kB' 'Mapped: 156728 kB' 'Shmem: 8620032 kB' 'KReclaimable: 232720 kB' 'Slab: 910420 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 677700 kB' 'KernelStack: 21648 kB' 'PageTables: 7284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10309144 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.273 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.273 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
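Earlier in this block the helper returned 0 for HugePages_Surp (surp=0) and 0 for HugePages_Rsvd (resv=0), and hugepages.sh echoed nr_hugepages=1536 before asserting (( 1536 == nr_hugepages + surp + resv )); the scan in progress above resolves HugePages_Total for the same check. A short sketch of that bookkeeping, reusing the get_meminfo_sketch helper from the earlier note (values shown are the ones from this run):

# Surplus/reserved bookkeeping as logged above.
surp=$(get_meminfo_sketch HugePages_Surp)      # 0 in this run
resv=$(get_meminfo_sketch HugePages_Rsvd)      # 0 in this run
nr_hugepages=1536                              # what the custom_alloc test requested
echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp anon_hugepages=0"
total=$(get_meminfo_sketch HugePages_Total)    # 1536 in this run
# The test only proceeds when the kernel's view matches the request exactly.
(( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2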
00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.553 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
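The backslash-escaped right-hand sides in these comparisons (for example \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l) are only an xtrace artifact: the script compares against a quoted variable, and bash re-quotes that expansion character by character when it prints the [[ ]] test, so the match itself is an ordinary literal string comparison. A small reproduction of the effect (illustrative only):

get=HugePages_Total
set -x
[[ MemTotal == "$get" ]]   # traced as: [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]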
00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
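Once the scan above returns 1536 for HugePages_Total and the total matches the request, hugepages.sh switches to per-node checks: get_nodes records the current count for each NUMA node (512 on node0 and 1024 on node1 in this run), and the HugePages_* fields are then looked up again per node from /sys/devices/system/node/nodeN/meminfo. A sketch of that per-node walk, again using the hypothetical get_meminfo_sketch helper (the nodes_sys name follows the trace; the exact sysfs path read in the loop body is illustrative):

# Per-node view of the 1536-page allocation seen in this run (512 + 1024, 2048 kB pages).
declare -a nodes_sys
for node_dir in /sys/devices/system/node/node[0-9]*; do
    id=${node_dir##*node}
    nodes_sys[$id]=$(< "$node_dir/hugepages/hugepages-2048kB/nr_hugepages")
    surp=$(get_meminfo_sketch HugePages_Surp "$id")   # reads node${id}/meminfo
    echo "node$id: nr_hugepages=${nodes_sys[$id]} surplus=$surp"
done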
00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:11.554 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 19850100 kB' 'MemUsed: 12784336 kB' 'SwapCached: 0 kB' 'Active: 6721852 kB' 'Inactive: 3572048 kB' 
'Active(anon): 6474252 kB' 'Inactive(anon): 0 kB' 'Active(file): 247600 kB' 'Inactive(file): 3572048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9853344 kB' 'Mapped: 79480 kB' 'AnonPages: 443644 kB' 'Shmem: 6033696 kB' 'KernelStack: 12904 kB' 'PageTables: 4824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 109452 kB' 'Slab: 453436 kB' 'SReclaimable: 109452 kB' 'SUnreclaim: 343984 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- 
# continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.555 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 20615212 kB' 'MemUsed: 7034148 kB' 'SwapCached: 0 kB' 'Active: 2822232 kB' 'Inactive: 116832 kB' 'Active(anon): 2653348 kB' 'Inactive(anon): 0 kB' 'Active(file): 168884 kB' 'Inactive(file): 
116832 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2872124 kB' 'Mapped: 77248 kB' 'AnonPages: 67080 kB' 'Shmem: 2586408 kB' 'KernelStack: 8760 kB' 'PageTables: 2512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 123268 kB' 'Slab: 456984 kB' 'SReclaimable: 123268 kB' 'SUnreclaim: 333716 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.556 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512' 00:05:11.557 node0=512 expecting 512 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024' 00:05:11.557 node1=1024 expecting 1024 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:11.557 00:05:11.557 real 0m3.500s 00:05:11.557 user 0m1.323s 00:05:11.557 sys 0m2.221s 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:11.557 01:20:57 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:11.557 ************************************ 00:05:11.557 END TEST custom_alloc 00:05:11.557 ************************************ 00:05:11.557 01:20:57 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:11.557 01:20:57 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:11.557 01:20:57 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:11.557 01:20:57 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:11.557 ************************************ 
00:05:11.557 START TEST no_shrink_alloc 00:05:11.557 ************************************ 00:05:11.557 01:20:57 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # no_shrink_alloc 00:05:11.557 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0 00:05:11.557 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:11.557 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0') 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:11.558 01:20:57 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:14.874 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:80:04.5 (8086 
2021): Already using the vfio-pci driver 00:05:14.874 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:14.874 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41467348 kB' 'MemAvailable: 45191980 kB' 'Buffers: 9316 kB' 'Cached: 12716224 kB' 'SwapCached: 0 kB' 'Active: 9551964 kB' 'Inactive: 3688880 kB' 'Active(anon): 9135480 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518648 kB' 'Mapped: 157444 kB' 'Shmem: 8620176 kB' 'KReclaimable: 232720 kB' 'Slab: 910868 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 678148 kB' 'KernelStack: 21792 kB' 'PageTables: 7628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10319896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214276 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.874 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
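The anon-page sampling being traced here runs only because the check '[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]' a few lines earlier found that transparent hugepages are not pinned to "never" on this host. A minimal sketch of that guard, assuming the policy string is read from the standard sysfs knob (the path below is an assumption, it is not spelled out in this log):

#!/usr/bin/env bash
# Sketch only: sample AnonHugePages unless transparent hugepages are disabled.
# Assumption: the policy string comes from the usual sysfs knob.
thp_policy=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"

if [[ $thp_policy != *"[never]"* ]]; then
    # THP may be in use, so anonymous huge pages are worth accounting for.
    anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
else
    anon=0
fi
echo "AnonHugePages: ${anon:-0} kB"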
00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
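The long run of '[[ ... == ... ]]' / 'continue' pairs above and below is get_meminfo stepping through /proc/meminfo one field at a time: each line is split with IFS=': ' into a key and a value, non-matching keys are skipped with continue, and the first match is echoed back to the caller (0 for AnonHugePages in this run). A standalone sketch of that scan, simplified from the common.sh helper rather than copied from it:

# Sketch of the field scan the xtrace is stepping through; not the exact
# common.sh implementation (which slurps the file with mapfile first).
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # Per-node statistics live under /sys when a node id is passed in.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    local var val _
    # The sed strips the "Node <id> " prefix that only the per-node files carry.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skip every field except the one requested
        echo "${val:-0}"
        return 0
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    echo 0                                  # nothing matched: defensive fallback in this sketch
}

get_meminfo_sketch AnonHugePages      # whole machine
get_meminfo_sketch HugePages_Surp 1   # node 1, if present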
00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
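common.sh points the same parser at two different sources: /proc/meminfo when no node is given, and /sys/devices/system/node/nodeN/meminfo when one is, as in the node 1 HugePages_Surp lookup traced earlier in the custom_alloc test. The per-node files prefix every line with "Node <id> ", which the 'mem=("${mem[@]#Node +([0-9]) }")' expansion visible in the trace strips away. A small demonstration of that strip, assuming a machine that actually has a node1 directory:

# Demonstration of the prefix strip used above; requires extglob for +([0-9]).
shopt -s extglob

mapfile -t mem < /sys/devices/system/node/node1/meminfo   # assumes node1 exists
# "Node 1 HugePages_Surp:      0"  becomes  "HugePages_Surp:      0",
# so the same IFS=': ' parse works for per-node files and /proc/meminfo alike.
mem=("${mem[@]#Node +([0-9]) }")
printf '%s\n' "${mem[@]:0:3}"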
00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
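For reference, the parameters driving this no_shrink_alloc pass were fixed at the top of the test: get_test_nr_hugepages 2097152 0 asked for a 2097152 kB pool on node 0, which with the 2048 kB Hugepagesize reported in the meminfo dumps works out to the nr_hugepages=1024 and NRHUGE=1024 HUGENODE=0 seen in the trace. A compressed sketch of that bookkeeping (the division is the obvious reading of the numbers, not a line quoted from hugepages.sh):

# Sketch of how the request turns into the per-node target; assumption:
# nr_hugepages is the requested size divided by the default hugepage size.
size_kb=2097152            # first argument to get_test_nr_hugepages
default_hugepages=2048     # Hugepagesize in kB, per the meminfo dumps above
nr_hugepages=$(( size_kb / default_hugepages ))   # 1024

nodes_test=()
for node in 0; do          # second argument: only node 0 this time
    nodes_test[node]=$nr_hugepages
done

echo "NRHUGE=$nr_hugepages HUGENODE=0"
echo "node0 target: ${nodes_test[0]} pages"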
00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.875 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # 
local get=HugePages_Surp 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41467448 kB' 'MemAvailable: 45192080 kB' 'Buffers: 9316 kB' 'Cached: 12716228 kB' 'SwapCached: 0 kB' 'Active: 9547152 kB' 'Inactive: 3688880 kB' 'Active(anon): 9130668 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513988 kB' 'Mapped: 157428 kB' 'Shmem: 8620180 kB' 'KReclaimable: 232720 kB' 'Slab: 910844 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 678124 kB' 'KernelStack: 21808 kB' 'PageTables: 7652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10315680 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.876 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.877 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 
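[editor's note] The xtrace above is the get_meminfo helper from setup/common.sh scanning every /proc/meminfo key until it reaches HugePages_Surp (value 0, so surp=0), and the same walk is about to repeat for HugePages_Rsvd. A rough reconstruction of what that helper appears to do, pieced together only from the trace lines (variable names follow the trace; the real setup/common.sh may differ in details):

#!/usr/bin/env bash
shopt -s extglob   # needed for the "Node +([0-9]) " prefix strip seen in the trace

get_meminfo() {
    local get=$1        # key to look up, e.g. HugePages_Surp
    local node=${2:-}   # optional NUMA node; empty means system-wide /proc/meminfo
    local var val rest
    local mem_f mem
    mem_f=/proc/meminfo
    # When a node is given and a per-node meminfo exists, read that file instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node meminfo prefixes every line with "Node N "; strip it so keys match.
    mem=("${mem[@]#Node +([0-9]) }")
    local line
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val rest <<< "$line"
        [[ $var == "$get" ]] || continue   # the long run of "continue" lines above
        echo "$val"                         # e.g. 0 for HugePages_Surp
        return 0
    done
    return 1
}

Usage matching the trace: get_meminfo HugePages_Surp (system-wide) or get_meminfo HugePages_Surp 0 (node 0), with the caller capturing the echoed value.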
00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41468676 kB' 'MemAvailable: 45193308 kB' 'Buffers: 9316 kB' 'Cached: 12716260 kB' 'SwapCached: 0 kB' 'Active: 9551696 kB' 'Inactive: 3688880 kB' 'Active(anon): 9135212 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518496 kB' 'Mapped: 157568 kB' 'Shmem: 8620212 kB' 'KReclaimable: 232720 kB' 'Slab: 910844 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 678124 kB' 'KernelStack: 21808 kB' 'PageTables: 7764 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10320444 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214276 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.878 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.879 01:21:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.879 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:14.880 nr_hugepages=1024 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:14.880 resv_hugepages=0 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:14.880 surplus_hugepages=0 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:14.880 anon_hugepages=0 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:14.880 01:21:00 
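[editor's note] At this point the trace has established surp=0 and resv=0, echoed the nr_hugepages/resv_hugepages/surplus_hugepages/anon_hugepages summary, and passed the arithmetic checks at hugepages.sh@106 and @108 before re-reading HugePages_Total. A minimal sketch of that accounting step, assuming the get_meminfo helper sketched earlier; the function name check_no_shrink_alloc is hypothetical and only mirrors what the trace implies:

check_no_shrink_alloc() {
    local nr_hugepages=1024   # pages the test requested
    local surp resv total
    surp=$(get_meminfo HugePages_Surp)    # 0 in this run
    resv=$(get_meminfo HugePages_Rsvd)    # 0 in this run
    echo "nr_hugepages=$nr_hugepages"
    echo "resv_hugepages=$resv"
    echo "surplus_hugepages=$surp"
    echo "anon_hugepages=0"
    # The pool must not have shrunk: the kernel-visible total still has to
    # account for the requested pages plus any surplus/reserved ones.
    total=$(get_meminfo HugePages_Total)  # 1024 in this run
    (( total == nr_hugepages + surp + resv ))
}

In the log both comparisons evaluate with 1024 on the left, so the check succeeds and the test proceeds to the per-node verification.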
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41468180 kB' 'MemAvailable: 45192812 kB' 'Buffers: 9316 kB' 'Cached: 12716280 kB' 'SwapCached: 0 kB' 'Active: 9547444 kB' 'Inactive: 3688880 kB' 'Active(anon): 9130960 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514304 kB' 'Mapped: 157568 kB' 'Shmem: 8620232 kB' 'KReclaimable: 232720 kB' 'Slab: 910844 kB' 'SReclaimable: 232720 kB' 'SUnreclaim: 678124 kB' 'KernelStack: 21792 kB' 'PageTables: 7668 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10317728 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.880 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.881 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:14.882 01:21:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 18802604 kB' 'MemUsed: 13831832 kB' 'SwapCached: 0 kB' 'Active: 6723172 kB' 'Inactive: 3572048 kB' 'Active(anon): 6475572 kB' 'Inactive(anon): 0 kB' 'Active(file): 247600 kB' 'Inactive(file): 3572048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9853376 kB' 'Mapped: 80268 kB' 'AnonPages: 445104 kB' 'Shmem: 6033728 kB' 'KernelStack: 12984 kB' 'PageTables: 5000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 109452 kB' 'Slab: 453588 kB' 'SReclaimable: 109452 kB' 'SUnreclaim: 344136 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
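The long runs of '[[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]' and '[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]' entries in this trace are SPDK's get_meminfo helper scanning a meminfo file record by record until it reaches the requested key, here the per-node file /sys/devices/system/node/node0/meminfo. A minimal bash sketch of that pattern is below for orientation; the function name, argument handling and error paths are simplified assumptions, not the exact setup/common.sh implementation.

#!/usr/bin/env bash
# Approximation of the meminfo lookup traced above; simplified, illustrative only.
shopt -s extglob

get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    local -a mem
    local var val

    # When a node is given, prefer its per-node meminfo file, as the trace does.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    # Per-node files prefix every record with "Node N "; strip it (extglob pattern).
    mem=("${mem[@]#Node +([0-9]) }")

    # Walk "Key:   value kB" records until the requested key is found.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

# Example, using the node 0 snapshot printed above:
#   get_meminfo_sketch HugePages_Total 0   ->  1024
#   get_meminfo_sketch HugePages_Surp 0    ->  0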
00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.882 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:14.883 node0=1024 expecting 1024 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:14.883 01:21:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:18.173 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:18.173 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:18.173 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41452636 kB' 'MemAvailable: 45177252 kB' 'Buffers: 9316 kB' 'Cached: 12716380 kB' 'SwapCached: 0 kB' 'Active: 9548704 kB' 'Inactive: 3688880 kB' 'Active(anon): 9132220 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515640 kB' 'Mapped: 156752 kB' 'Shmem: 8620332 kB' 'KReclaimable: 232688 kB' 'Slab: 910988 kB' 'SReclaimable: 232688 kB' 'SUnreclaim: 678300 kB' 'KernelStack: 22144 kB' 'PageTables: 8504 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10317480 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214464 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.173 01:21:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.173 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
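Earlier in this block, setup output was re-run with CLEAR_HUGE=no NRHUGE=512 HUGENODE=0 and scripts/setup.sh replied "INFO: Requested 512 hugepages but 1024 already allocated on node0". The short-circuit behind that message can be sketched against the standard sysfs per-node hugepage counter as follows; the function name and the bare sysfs write are illustrative assumptions, and the real setup.sh does considerably more (device binding, clearing, multiple page sizes).

#!/usr/bin/env bash
# Illustrative request helper; shows only the "already allocated" check seen in the log.
request_node_hugepages() {
    local node=$1 requested=$2 size_kb=${3:-2048}
    local nr_file=/sys/devices/system/node/node$node/hugepages/hugepages-${size_kb}kB/nr_hugepages
    local current
    current=$(<"$nr_file")

    if (( current >= requested )); then
        echo "INFO: Requested $requested hugepages but $current already allocated on node$node"
        return 0
    fi

    # Needs root; the kernel may still allocate fewer pages than asked for.
    echo "$requested" > "$nr_file"
}

# Roughly what HUGENODE=0 NRHUGE=512 asks for on this system:
#   request_node_hugepages 0 512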
00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.174 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41454504 kB' 'MemAvailable: 45179120 kB' 'Buffers: 9316 kB' 'Cached: 12716384 kB' 'SwapCached: 0 kB' 'Active: 9547824 kB' 'Inactive: 3688880 kB' 'Active(anon): 9131340 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514548 kB' 'Mapped: 156832 kB' 'Shmem: 8620336 kB' 'KReclaimable: 232688 kB' 'Slab: 910980 kB' 'SReclaimable: 232688 kB' 'SUnreclaim: 678292 kB' 'KernelStack: 22080 kB' 'PageTables: 8828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10318888 kB' 'VmallocTotal: 34359738367 kB' 
'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 
01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.175 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.176 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.177 01:21:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41455060 kB' 'MemAvailable: 45179676 kB' 'Buffers: 9316 kB' 'Cached: 12716400 kB' 'SwapCached: 0 kB' 'Active: 9548720 kB' 'Inactive: 3688880 kB' 'Active(anon): 9132236 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514988 kB' 'Mapped: 156816 kB' 'Shmem: 8620352 kB' 'KReclaimable: 232688 kB' 'Slab: 910820 kB' 'SReclaimable: 232688 kB' 'SUnreclaim: 678132 kB' 'KernelStack: 22464 kB' 'PageTables: 9120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10317528 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 
'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 
01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.177 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.178 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:18.179 nr_hugepages=1024 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:18.179 resv_hugepages=0 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:18.179 surplus_hugepages=0 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:18.179 anon_hugepages=0 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41454996 kB' 'MemAvailable: 45179612 kB' 'Buffers: 9316 kB' 'Cached: 12716424 kB' 'SwapCached: 0 kB' 'Active: 9548960 kB' 'Inactive: 3688880 kB' 'Active(anon): 9132476 kB' 'Inactive(anon): 0 kB' 'Active(file): 416484 kB' 'Inactive(file): 3688880 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515648 kB' 'Mapped: 156800 kB' 'Shmem: 8620376 kB' 'KReclaimable: 232688 kB' 'Slab: 910788 kB' 'SReclaimable: 232688 kB' 'SUnreclaim: 678100 kB' 'KernelStack: 22480 kB' 'PageTables: 9164 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10316304 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 76160 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 
'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 478580 kB' 'DirectMap2M: 12838912 kB' 'DirectMap1G: 56623104 kB' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.179 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.180 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:18.181 01:21:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 18783844 kB' 'MemUsed: 13850592 kB' 'SwapCached: 0 kB' 'Active: 6723708 kB' 'Inactive: 3572048 kB' 'Active(anon): 6476108 kB' 'Inactive(anon): 0 kB' 'Active(file): 247600 kB' 'Inactive(file): 3572048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9853388 kB' 'Mapped: 79584 kB' 'AnonPages: 445756 kB' 'Shmem: 6033740 kB' 'KernelStack: 13544 kB' 'PageTables: 6576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 109420 kB' 'Slab: 453312 kB' 'SReclaimable: 109420 kB' 'SUnreclaim: 343892 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.181 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:18.182 node0=1024 expecting 1024 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:18.182 00:05:18.182 real 0m6.601s 00:05:18.182 user 0m2.381s 00:05:18.182 sys 0m4.233s 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:18.182 01:21:04 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:18.182 ************************************ 00:05:18.182 END TEST no_shrink_alloc 00:05:18.182 ************************************ 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:18.182 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:18.183 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:18.183 01:21:04 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:18.183 00:05:18.183 real 0m22.659s 00:05:18.183 user 0m7.810s 00:05:18.183 sys 0m13.516s 
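[editor's note] The long scan traced above is setup/common.sh's get_meminfo walking a meminfo file field by field with IFS=': ' and read -r var val _ until the requested key (HugePages_Total, then HugePages_Surp for node 0) matches, at which point the value is echoed and the caller's arithmetic check runs. A minimal standalone sketch of that lookup, assuming only the standard /proc and sysfs files; the function name get_meminfo_value and its argument order are illustrative, not the repository's own API:

    # Hedged sketch of the lookup the xtrace above performs.
    get_meminfo_value() {
        local key=$1 node=${2:-} mem_f=/proc/meminfo var val
        # Per-NUMA-node figures live under sysfs when a node index is given.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        # Split each line on ':' or space and keep scanning until the key matches;
        # the "Node <N>" prefix of per-node files is stripped first so both file
        # formats parse the same way.
        while IFS=': ' read -r var val _; do
            [[ $var == "$key" ]] && { echo "$val"; return 0; }
        done < <(sed 's/^Node [0-9]* *//' "$mem_f")
        return 1
    }

For example, get_meminfo_value HugePages_Free 0 would print the per-node count behind the "node0=1024 expecting 1024" line above.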
00:05:18.183 01:21:04 setup.sh.hugepages -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:18.183 01:21:04 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:18.183 ************************************ 00:05:18.183 END TEST hugepages 00:05:18.183 ************************************ 00:05:18.183 01:21:04 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:18.183 01:21:04 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:18.183 01:21:04 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:18.183 01:21:04 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:18.183 ************************************ 00:05:18.183 START TEST driver 00:05:18.183 ************************************ 00:05:18.183 01:21:04 setup.sh.driver -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:18.442 * Looking for test storage... 00:05:18.442 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:18.442 01:21:04 setup.sh.driver -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:05:18.442 01:21:04 setup.sh.driver -- common/autotest_common.sh@1681 -- # lcov --version 00:05:18.442 01:21:04 setup.sh.driver -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:05:18.442 01:21:04 setup.sh.driver -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:18.442 01:21:04 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:05:18.442 01:21:04 setup.sh.driver -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:18.442 01:21:04 setup.sh.driver -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:05:18.442 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.442 --rc genhtml_branch_coverage=1 00:05:18.442 --rc genhtml_function_coverage=1 00:05:18.442 --rc genhtml_legend=1 00:05:18.442 --rc geninfo_all_blocks=1 00:05:18.442 --rc geninfo_unexecuted_blocks=1 00:05:18.442 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:18.442 ' 00:05:18.442 01:21:04 setup.sh.driver -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:05:18.442 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.442 --rc genhtml_branch_coverage=1 00:05:18.442 --rc genhtml_function_coverage=1 00:05:18.442 --rc genhtml_legend=1 00:05:18.442 --rc geninfo_all_blocks=1 00:05:18.442 --rc geninfo_unexecuted_blocks=1 00:05:18.442 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:18.442 ' 00:05:18.442 01:21:04 setup.sh.driver -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:05:18.442 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.442 --rc genhtml_branch_coverage=1 00:05:18.442 --rc genhtml_function_coverage=1 00:05:18.442 --rc genhtml_legend=1 00:05:18.442 --rc geninfo_all_blocks=1 00:05:18.442 --rc geninfo_unexecuted_blocks=1 00:05:18.442 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:18.442 ' 00:05:18.442 01:21:04 setup.sh.driver -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:05:18.442 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.442 --rc genhtml_branch_coverage=1 00:05:18.442 --rc genhtml_function_coverage=1 00:05:18.442 --rc genhtml_legend=1 00:05:18.442 --rc geninfo_all_blocks=1 00:05:18.442 --rc geninfo_unexecuted_blocks=1 00:05:18.442 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:18.442 ' 00:05:18.442 01:21:04 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:18.442 01:21:04 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:18.442 01:21:04 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:23.705 01:21:09 setup.sh.driver -- 
setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:23.705 01:21:09 setup.sh.driver -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:23.705 01:21:09 setup.sh.driver -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:23.705 01:21:09 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:23.705 ************************************ 00:05:23.705 START TEST guess_driver 00:05:23.705 ************************************ 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # guess_driver 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:23.705 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:23.705 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:23.705 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:23.705 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:23.705 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:23.705 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:23.705 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:23.705 Looking for driver=vfio-pci 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- 
# setup output config 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:23.705 01:21:09 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.995 01:21:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.375 01:21:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:28.375 01:21:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:28.375 01:21:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:28.376 01:21:14 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:28.376 01:21:14 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:28.376 01:21:14 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:28.376 01:21:14 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:33.645 00:05:33.646 real 0m9.872s 00:05:33.646 user 0m2.653s 00:05:33.646 sys 0m4.950s 00:05:33.646 01:21:18 setup.sh.driver.guess_driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:33.646 01:21:18 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:33.646 ************************************ 00:05:33.646 END TEST guess_driver 00:05:33.646 ************************************ 00:05:33.646 00:05:33.646 real 0m14.790s 00:05:33.646 user 0m4.077s 00:05:33.646 sys 0m7.703s 00:05:33.646 01:21:18 setup.sh.driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:33.646 01:21:18 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:33.646 ************************************ 00:05:33.646 END TEST driver 00:05:33.646 ************************************ 00:05:33.646 01:21:18 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:33.646 01:21:18 setup.sh -- 
common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:33.646 01:21:18 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:33.646 01:21:18 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:33.646 ************************************ 00:05:33.646 START TEST devices 00:05:33.646 ************************************ 00:05:33.646 01:21:19 setup.sh.devices -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:33.646 * Looking for test storage... 00:05:33.646 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:33.646 01:21:19 setup.sh.devices -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:05:33.646 01:21:19 setup.sh.devices -- common/autotest_common.sh@1681 -- # lcov --version 00:05:33.646 01:21:19 setup.sh.devices -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:05:33.646 01:21:19 setup.sh.devices -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@336 -- # IFS=.-: 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@336 -- # read -ra ver1 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@337 -- # IFS=.-: 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@337 -- # read -ra ver2 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@338 -- # local 'op=<' 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@340 -- # ver1_l=2 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@341 -- # ver2_l=1 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@344 -- # case "$op" in 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@345 -- # : 1 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@365 -- # decimal 1 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@353 -- # local d=1 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@355 -- # echo 1 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@365 -- # ver1[v]=1 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@366 -- # decimal 2 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@353 -- # local d=2 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@355 -- # echo 2 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@366 -- # ver2[v]=2 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:33.646 01:21:19 setup.sh.devices -- scripts/common.sh@368 -- # return 0 00:05:33.646 01:21:19 setup.sh.devices -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:33.646 01:21:19 setup.sh.devices -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:05:33.646 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.646 --rc genhtml_branch_coverage=1 00:05:33.646 --rc genhtml_function_coverage=1 00:05:33.646 --rc genhtml_legend=1 00:05:33.646 --rc geninfo_all_blocks=1 00:05:33.646 --rc geninfo_unexecuted_blocks=1 00:05:33.646 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.646 ' 00:05:33.646 01:21:19 setup.sh.devices -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:05:33.646 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.646 --rc genhtml_branch_coverage=1 00:05:33.646 --rc genhtml_function_coverage=1 00:05:33.646 --rc genhtml_legend=1 00:05:33.646 --rc geninfo_all_blocks=1 00:05:33.646 --rc geninfo_unexecuted_blocks=1 00:05:33.646 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.646 ' 00:05:33.646 01:21:19 setup.sh.devices -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:05:33.646 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.646 --rc genhtml_branch_coverage=1 00:05:33.646 --rc genhtml_function_coverage=1 00:05:33.646 --rc genhtml_legend=1 00:05:33.646 --rc geninfo_all_blocks=1 00:05:33.646 --rc geninfo_unexecuted_blocks=1 00:05:33.646 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.646 ' 00:05:33.646 01:21:19 setup.sh.devices -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:05:33.646 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.646 --rc genhtml_branch_coverage=1 00:05:33.646 --rc genhtml_function_coverage=1 00:05:33.646 --rc genhtml_legend=1 00:05:33.646 --rc geninfo_all_blocks=1 00:05:33.646 --rc geninfo_unexecuted_blocks=1 00:05:33.646 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.646 ' 00:05:33.646 01:21:19 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:33.646 01:21:19 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:33.646 01:21:19 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:33.646 01:21:19 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:36.935 01:21:22 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:36.935 01:21:22 setup.sh.devices -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:05:36.935 01:21:22 setup.sh.devices -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:05:36.935 01:21:22 setup.sh.devices -- common/autotest_common.sh@1656 -- # local nvme bdf 00:05:36.936 01:21:22 setup.sh.devices -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:05:36.936 01:21:22 setup.sh.devices -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:05:36.936 01:21:22 setup.sh.devices -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:05:36.936 01:21:22 setup.sh.devices -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:36.936 01:21:22 setup.sh.devices -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:36.936 01:21:22 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:05:36.936 01:21:22 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:36.936 No valid GPT data, bailing 00:05:36.936 01:21:22 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:36.936 01:21:22 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:05:36.936 01:21:22 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:36.936 01:21:22 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:36.936 01:21:22 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:36.936 01:21:22 setup.sh.devices -- setup/common.sh@80 -- # echo 1600321314816 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:36.936 01:21:22 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:36.936 01:21:22 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:36.936 01:21:22 
setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:36.936 01:21:22 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:36.936 ************************************ 00:05:36.936 START TEST nvme_mount 00:05:36.936 ************************************ 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # nvme_mount 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:36.936 01:21:22 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:38.314 Creating new GPT entries in memory. 00:05:38.314 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:38.314 other utilities. 00:05:38.314 01:21:23 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:38.314 01:21:23 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:38.314 01:21:23 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:38.314 01:21:23 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:38.314 01:21:23 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:39.251 Creating new GPT entries in memory. 00:05:39.251 The operation has completed successfully. 
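[editor's note] The sgdisk/mkfs/mount sequence traced here (zap the old label, create partition 1 over sectors 2048-2099199, format it ext4, mount it for the test file) condenses to a few commands. A hypothetical replay of those steps; the device name, sector range and mount point are placeholders, and partprobe merely stands in for the repository's sync_dev_uevents.sh wait:

    disk=/dev/nvme0n1
    part=${disk}p1
    mnt=/mnt/nvme_mount_test

    sgdisk "$disk" --zap-all                  # destroy any existing GPT/MBR label
    sgdisk "$disk" --new=1:2048:2099199       # one 1 GiB partition starting at sector 2048
    partprobe "$disk"                         # let the kernel pick up the new partition
    mkfs.ext4 -qF "$part"                     # quiet, forced ext4 format
    mkdir -p "$mnt"
    mount "$part" "$mnt"                      # mount point for the test file written next

The cleanup traced a little further on (umount, then wipefs --all on the partition and on the whole disk) is this sequence in reverse.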
00:05:39.251 01:21:24 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:39.251 01:21:24 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:39.251 01:21:24 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 793717 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:39.251 01:21:25 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:41.783 01:21:27 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:42.043 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:42.043 01:21:27 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:42.303 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:42.303 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:42.303 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:42.303 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:42.303 01:21:28 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.593 01:21:31 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.593 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # 
local pci status 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:45.594 01:21:31 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.127 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.387 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:48.387 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:48.387 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:48.387 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:48.646 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:48.646 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:48.646 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:48.646 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:48.646 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:48.646 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:48.646 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:48.646 01:21:34 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:48.646 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:48.646 00:05:48.646 real 0m11.522s 00:05:48.646 user 0m3.133s 00:05:48.646 sys 0m6.175s 00:05:48.646 01:21:34 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:48.646 01:21:34 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:48.646 ************************************ 00:05:48.646 END TEST nvme_mount 00:05:48.646 ************************************ 00:05:48.646 01:21:34 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:48.646 01:21:34 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:48.646 01:21:34 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:48.646 01:21:34 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:48.646 ************************************ 00:05:48.646 START TEST dm_mount 00:05:48.646 ************************************ 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # dm_mount 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # 
local disk=nvme0n1 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:48.646 01:21:34 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:49.584 Creating new GPT entries in memory. 00:05:49.584 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:49.584 other utilities. 00:05:49.584 01:21:35 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:49.584 01:21:35 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:49.584 01:21:35 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:49.584 01:21:35 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:49.584 01:21:35 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:50.966 Creating new GPT entries in memory. 00:05:50.966 The operation has completed successfully. 00:05:50.966 01:21:36 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:50.966 01:21:36 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:50.966 01:21:36 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:50.966 01:21:36 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:50.966 01:21:36 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:51.904 The operation has completed successfully. 
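The trace above partitions the test NVMe disk for dm_mount: sgdisk first zaps the existing GPT, then two 1 GiB partitions are created over sectors 2048-2099199 and 2099200-4196351. A minimal standalone sketch of those same steps, assuming a scratch device you can safely wipe; the sector ranges are copied from the trace, the DISK variable and partprobe/lsblk calls are illustrative additions.

#!/usr/bin/env bash
# Sketch of the partitioning sequence traced above (sgdisk --zap-all, then two
# 1 GiB GPT partitions). Assumes $DISK is a disposable NVMe device.
set -euo pipefail

DISK=${DISK:-/dev/nvme0n1}

# 1 GiB = 1073741824 bytes / 512-byte sectors = 2097152 sectors, which matches
# the 2048-2099199 and 2099200-4196351 ranges seen in the trace.
sgdisk "$DISK" --zap-all
flock "$DISK" sgdisk "$DISK" --new=1:2048:2099199
flock "$DISK" sgdisk "$DISK" --new=2:2099200:4196351

# Let the kernel re-read the partition table and show the result.
partprobe "$DISK"
lsblk "$DISK"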
00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 797902 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:51.904 01:21:37 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:51.905 01:21:37 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.195 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:55.196 01:21:40 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:55.196 01:21:40 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:58.486 01:21:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.486 01:21:44 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:58.486 01:21:44 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:58.486 01:21:44 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:58.486 01:21:44 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:58.486 01:21:44 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:58.486 01:21:44 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:58.486 01:21:44 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:58.486 01:21:44 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:58.486 01:21:44 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:58.487 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:58.487 01:21:44 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:58.487 01:21:44 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:58.487 00:05:58.487 real 0m9.647s 00:05:58.487 user 0m2.302s 00:05:58.487 sys 0m4.391s 00:05:58.487 01:21:44 setup.sh.devices.dm_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:58.487 01:21:44 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:58.487 ************************************ 00:05:58.487 END TEST dm_mount 00:05:58.487 ************************************ 00:05:58.487 01:21:44 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:58.487 01:21:44 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:58.487 01:21:44 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:58.487 01:21:44 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:58.487 01:21:44 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:58.487 01:21:44 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:58.487 01:21:44 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:58.746 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:58.746 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:58.746 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:58.746 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:58.746 01:21:44 setup.sh.devices -- 
setup/devices.sh@12 -- # cleanup_dm 00:05:58.746 01:21:44 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:58.746 01:21:44 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:58.746 01:21:44 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:58.746 01:21:44 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:58.746 01:21:44 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:58.746 01:21:44 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:58.746 00:05:58.746 real 0m25.480s 00:05:58.746 user 0m6.909s 00:05:58.746 sys 0m13.306s 00:05:58.746 01:21:44 setup.sh.devices -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:58.746 01:21:44 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:58.746 ************************************ 00:05:58.746 END TEST devices 00:05:58.746 ************************************ 00:05:58.746 00:05:58.746 real 1m26.653s 00:05:58.746 user 0m26.150s 00:05:58.746 sys 0m48.904s 00:05:58.746 01:21:44 setup.sh -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:58.746 01:21:44 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:58.746 ************************************ 00:05:58.746 END TEST setup.sh 00:05:58.746 ************************************ 00:05:58.746 01:21:44 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:02.026 Hugepages 00:06:02.026 node hugesize free / total 00:06:02.026 node0 1048576kB 0 / 0 00:06:02.026 node0 2048kB 1024 / 1024 00:06:02.026 node1 1048576kB 0 / 0 00:06:02.026 node1 2048kB 1024 / 1024 00:06:02.026 00:06:02.026 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:02.026 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:02.026 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:02.027 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:02.027 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:02.027 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:02.027 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:02.027 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:02.027 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:02.027 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:02.027 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:02.027 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:02.027 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:02.027 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:02.027 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:02.027 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:02.027 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:02.027 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:06:02.284 01:21:48 -- spdk/autotest.sh@117 -- # uname -s 00:06:02.284 01:21:48 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:06:02.284 01:21:48 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:06:02.284 01:21:48 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:05.563 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:00:04.1 (8086 2021): ioatdma 
-> vfio-pci 00:06:05.563 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:05.563 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:07.464 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:07.464 01:21:53 -- common/autotest_common.sh@1515 -- # sleep 1 00:06:08.400 01:21:54 -- common/autotest_common.sh@1516 -- # bdfs=() 00:06:08.400 01:21:54 -- common/autotest_common.sh@1516 -- # local bdfs 00:06:08.400 01:21:54 -- common/autotest_common.sh@1518 -- # bdfs=($(get_nvme_bdfs)) 00:06:08.400 01:21:54 -- common/autotest_common.sh@1518 -- # get_nvme_bdfs 00:06:08.400 01:21:54 -- common/autotest_common.sh@1496 -- # bdfs=() 00:06:08.400 01:21:54 -- common/autotest_common.sh@1496 -- # local bdfs 00:06:08.400 01:21:54 -- common/autotest_common.sh@1497 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:08.400 01:21:54 -- common/autotest_common.sh@1497 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:08.400 01:21:54 -- common/autotest_common.sh@1497 -- # jq -r '.config[].params.traddr' 00:06:08.400 01:21:54 -- common/autotest_common.sh@1498 -- # (( 1 == 0 )) 00:06:08.400 01:21:54 -- common/autotest_common.sh@1502 -- # printf '%s\n' 0000:d8:00.0 00:06:08.400 01:21:54 -- common/autotest_common.sh@1520 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:11.681 Waiting for block devices as requested 00:06:11.681 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:11.681 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:11.681 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:11.681 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:11.681 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:11.681 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:11.681 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:11.681 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:11.681 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:11.937 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:11.937 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:11.937 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:12.195 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:12.195 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:12.195 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:12.453 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:12.453 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:06:12.712 01:21:58 -- common/autotest_common.sh@1522 -- # for bdf in "${bdfs[@]}" 00:06:12.712 01:21:58 -- common/autotest_common.sh@1523 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:06:12.712 01:21:58 -- common/autotest_common.sh@1485 -- # grep 0000:d8:00.0/nvme/nvme 00:06:12.712 01:21:58 -- common/autotest_common.sh@1485 -- # readlink -f /sys/class/nvme/nvme0 00:06:12.712 01:21:58 -- common/autotest_common.sh@1485 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:12.712 01:21:58 -- common/autotest_common.sh@1486 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:06:12.712 01:21:58 -- common/autotest_common.sh@1490 -- # basename 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:12.712 01:21:58 -- common/autotest_common.sh@1490 -- # printf '%s\n' nvme0 00:06:12.712 01:21:58 -- common/autotest_common.sh@1523 -- # nvme_ctrlr=/dev/nvme0 00:06:12.712 01:21:58 -- common/autotest_common.sh@1524 -- # [[ -z /dev/nvme0 ]] 00:06:12.712 01:21:58 -- common/autotest_common.sh@1529 -- # nvme id-ctrl /dev/nvme0 00:06:12.712 01:21:58 -- common/autotest_common.sh@1529 -- # grep oacs 00:06:12.712 01:21:58 -- common/autotest_common.sh@1529 -- # cut -d: -f2 00:06:12.712 01:21:58 -- common/autotest_common.sh@1529 -- # oacs=' 0xe' 00:06:12.712 01:21:58 -- common/autotest_common.sh@1530 -- # oacs_ns_manage=8 00:06:12.712 01:21:58 -- common/autotest_common.sh@1532 -- # [[ 8 -ne 0 ]] 00:06:12.712 01:21:58 -- common/autotest_common.sh@1538 -- # nvme id-ctrl /dev/nvme0 00:06:12.712 01:21:58 -- common/autotest_common.sh@1538 -- # grep unvmcap 00:06:12.712 01:21:58 -- common/autotest_common.sh@1538 -- # cut -d: -f2 00:06:12.712 01:21:58 -- common/autotest_common.sh@1538 -- # unvmcap=' 0' 00:06:12.712 01:21:58 -- common/autotest_common.sh@1539 -- # [[ 0 -eq 0 ]] 00:06:12.712 01:21:58 -- common/autotest_common.sh@1541 -- # continue 00:06:12.712 01:21:58 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:06:12.712 01:21:58 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:12.712 01:21:58 -- common/autotest_common.sh@10 -- # set +x 00:06:12.712 01:21:58 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:06:12.712 01:21:58 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:12.712 01:21:58 -- common/autotest_common.sh@10 -- # set +x 00:06:12.712 01:21:58 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:15.996 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:15.996 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:17.370 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:17.627 01:22:03 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:06:17.627 01:22:03 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:17.627 01:22:03 -- common/autotest_common.sh@10 -- # set +x 00:06:17.627 01:22:03 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:06:17.627 01:22:03 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:06:17.627 01:22:03 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:06:17.628 01:22:03 -- common/autotest_common.sh@1561 -- # bdfs=() 00:06:17.628 01:22:03 -- common/autotest_common.sh@1561 -- # _bdfs=() 00:06:17.628 01:22:03 -- common/autotest_common.sh@1561 -- # local bdfs _bdfs 00:06:17.628 01:22:03 -- common/autotest_common.sh@1562 -- # _bdfs=($(get_nvme_bdfs)) 00:06:17.628 01:22:03 -- 
common/autotest_common.sh@1562 -- # get_nvme_bdfs 00:06:17.628 01:22:03 -- common/autotest_common.sh@1496 -- # bdfs=() 00:06:17.628 01:22:03 -- common/autotest_common.sh@1496 -- # local bdfs 00:06:17.628 01:22:03 -- common/autotest_common.sh@1497 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:17.628 01:22:03 -- common/autotest_common.sh@1497 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:17.628 01:22:03 -- common/autotest_common.sh@1497 -- # jq -r '.config[].params.traddr' 00:06:17.628 01:22:03 -- common/autotest_common.sh@1498 -- # (( 1 == 0 )) 00:06:17.628 01:22:03 -- common/autotest_common.sh@1502 -- # printf '%s\n' 0000:d8:00.0 00:06:17.628 01:22:03 -- common/autotest_common.sh@1563 -- # for bdf in "${_bdfs[@]}" 00:06:17.628 01:22:03 -- common/autotest_common.sh@1564 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:06:17.628 01:22:03 -- common/autotest_common.sh@1564 -- # device=0x0a54 00:06:17.628 01:22:03 -- common/autotest_common.sh@1565 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:17.628 01:22:03 -- common/autotest_common.sh@1566 -- # bdfs+=($bdf) 00:06:17.628 01:22:03 -- common/autotest_common.sh@1570 -- # (( 1 > 0 )) 00:06:17.628 01:22:03 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:06:17.628 01:22:03 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:06:17.628 01:22:03 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=807423 00:06:17.628 01:22:03 -- common/autotest_common.sh@1583 -- # waitforlisten 807423 00:06:17.628 01:22:03 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:17.628 01:22:03 -- common/autotest_common.sh@831 -- # '[' -z 807423 ']' 00:06:17.628 01:22:03 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:17.628 01:22:03 -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:17.628 01:22:03 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:17.628 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:17.628 01:22:03 -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:17.628 01:22:03 -- common/autotest_common.sh@10 -- # set +x 00:06:17.628 [2024-12-17 01:22:03.601136] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
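The helpers traced above build the NVMe BDF list by running gen_nvme.sh and extracting traddr with jq, then match each controller against PCI device ID 0x0a54 through sysfs. A rough sysfs-only equivalent is sketched below, without the SPDK helpers; the 0x0a54 filter and the 0000:d8:00.0 example address come from the trace, everything else is an assumption for illustration.

#!/usr/bin/env bash
# Sketch: list NVMe controllers by PCI address and keep those whose PCI device
# ID is 0x0a54, approximating what get_nvme_bdfs / get_nvme_bdfs_by_id do above
# (the real helpers go through scripts/gen_nvme.sh | jq -r '.config[].params.traddr').
set -euo pipefail

target_id="0x0a54"   # device ID checked in the trace
matches=()

for dev in /sys/class/nvme/nvme*; do
    [[ -e $dev ]] || continue
    bdf=$(basename "$(readlink -f "$dev/device")")      # e.g. 0000:d8:00.0
    devid=$(cat "/sys/bus/pci/devices/$bdf/device")     # e.g. 0x0a54
    if [[ $devid == "$target_id" ]]; then
        matches+=("$bdf")
    fi
done

if ((${#matches[@]})); then
    printf '%s\n' "${matches[@]}"
fi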
00:06:17.628 [2024-12-17 01:22:03.601206] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid807423 ] 00:06:17.885 [2024-12-17 01:22:03.668474] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.885 [2024-12-17 01:22:03.706878] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.143 01:22:03 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:18.143 01:22:03 -- common/autotest_common.sh@864 -- # return 0 00:06:18.143 01:22:03 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:06:18.143 01:22:03 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:06:18.143 01:22:03 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:06:21.421 nvme0n1 00:06:21.421 01:22:06 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:21.421 [2024-12-17 01:22:07.093035] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:21.421 request: 00:06:21.421 { 00:06:21.421 "nvme_ctrlr_name": "nvme0", 00:06:21.421 "password": "test", 00:06:21.421 "method": "bdev_nvme_opal_revert", 00:06:21.421 "req_id": 1 00:06:21.421 } 00:06:21.421 Got JSON-RPC error response 00:06:21.421 response: 00:06:21.421 { 00:06:21.421 "code": -32602, 00:06:21.421 "message": "Invalid parameters" 00:06:21.421 } 00:06:21.421 01:22:07 -- common/autotest_common.sh@1589 -- # true 00:06:21.421 01:22:07 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:06:21.421 01:22:07 -- common/autotest_common.sh@1593 -- # killprocess 807423 00:06:21.421 01:22:07 -- common/autotest_common.sh@950 -- # '[' -z 807423 ']' 00:06:21.421 01:22:07 -- common/autotest_common.sh@954 -- # kill -0 807423 00:06:21.421 01:22:07 -- common/autotest_common.sh@955 -- # uname 00:06:21.421 01:22:07 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:21.421 01:22:07 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 807423 00:06:21.421 01:22:07 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:21.421 01:22:07 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:21.421 01:22:07 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 807423' 00:06:21.421 killing process with pid 807423 00:06:21.421 01:22:07 -- common/autotest_common.sh@969 -- # kill 807423 00:06:21.421 01:22:07 -- common/autotest_common.sh@974 -- # wait 807423 00:06:21.421 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.421 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.421 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.421 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.421 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.421 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.421 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.421 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.421 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.421 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 
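The opal_revert_cleanup step above attaches the controller over SPDK's JSON-RPC interface and then calls bdev_nvme_opal_revert, which fails with error -32602 because this drive does not support Opal. A hedged sketch of the same two calls, assuming an spdk_tgt process is already listening on the default socket; the -b/-t/-a/-p arguments are copied verbatim from the trace, the RPC path variable is an assumption.

#!/usr/bin/env bash
# Sketch of the RPC sequence traced above; assumes spdk_tgt is already running
# and listening on /var/tmp/spdk.sock.
set -euo pipefail

RPC=${RPC:-./scripts/rpc.py}   # path inside an SPDK checkout (assumption)

# Attach the NVMe controller at the PCI address seen in the trace.
"$RPC" bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0

# Attempt an Opal revert; on drives without Opal support this returns
# JSON-RPC error -32602 ("Invalid parameters"), as in the log above.
if ! "$RPC" bdev_nvme_opal_revert -b nvme0 -p test; then
    echo "Opal revert not supported on this controller; continuing" >&2
fi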
of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: 
Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping 
cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:21.422 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:23.319 01:22:09 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:06:23.319 01:22:09 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:06:23.319 01:22:09 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:23.319 01:22:09 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:23.319 01:22:09 -- spdk/autotest.sh@149 -- # timing_enter lib 00:06:23.319 01:22:09 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:23.319 01:22:09 -- common/autotest_common.sh@10 -- # set +x 00:06:23.319 01:22:09 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:06:23.319 01:22:09 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:23.319 01:22:09 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:23.319 01:22:09 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:23.319 01:22:09 -- common/autotest_common.sh@10 -- # set +x 00:06:23.319 ************************************ 00:06:23.319 START TEST env 00:06:23.319 ************************************ 00:06:23.319 01:22:09 env -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:23.577 * Looking for test storage... 00:06:23.577 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:06:23.577 01:22:09 env -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:23.577 01:22:09 env -- common/autotest_common.sh@1681 -- # lcov --version 00:06:23.577 01:22:09 env -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:23.577 01:22:09 env -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:23.578 01:22:09 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:23.578 01:22:09 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:23.578 01:22:09 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:23.578 01:22:09 env -- scripts/common.sh@336 -- # IFS=.-: 00:06:23.578 01:22:09 env -- scripts/common.sh@336 -- # read -ra ver1 00:06:23.578 01:22:09 env -- scripts/common.sh@337 -- # IFS=.-: 00:06:23.578 01:22:09 env -- scripts/common.sh@337 -- # read -ra ver2 00:06:23.578 01:22:09 env -- scripts/common.sh@338 -- # local 'op=<' 00:06:23.578 01:22:09 env -- scripts/common.sh@340 -- # ver1_l=2 00:06:23.578 01:22:09 env -- scripts/common.sh@341 -- # ver2_l=1 00:06:23.578 01:22:09 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:23.578 01:22:09 env -- scripts/common.sh@344 -- # case "$op" in 00:06:23.578 01:22:09 env -- scripts/common.sh@345 -- # : 1 00:06:23.578 01:22:09 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:23.578 01:22:09 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:23.578 01:22:09 env -- scripts/common.sh@365 -- # decimal 1 00:06:23.578 01:22:09 env -- scripts/common.sh@353 -- # local d=1 00:06:23.578 01:22:09 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:23.578 01:22:09 env -- scripts/common.sh@355 -- # echo 1 00:06:23.578 01:22:09 env -- scripts/common.sh@365 -- # ver1[v]=1 00:06:23.578 01:22:09 env -- scripts/common.sh@366 -- # decimal 2 00:06:23.578 01:22:09 env -- scripts/common.sh@353 -- # local d=2 00:06:23.578 01:22:09 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:23.578 01:22:09 env -- scripts/common.sh@355 -- # echo 2 00:06:23.578 01:22:09 env -- scripts/common.sh@366 -- # ver2[v]=2 00:06:23.578 01:22:09 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:23.578 01:22:09 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:23.578 01:22:09 env -- scripts/common.sh@368 -- # return 0 00:06:23.578 01:22:09 env -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:23.578 01:22:09 env -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:23.578 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.578 --rc genhtml_branch_coverage=1 00:06:23.578 --rc genhtml_function_coverage=1 00:06:23.578 --rc genhtml_legend=1 00:06:23.578 --rc geninfo_all_blocks=1 00:06:23.578 --rc geninfo_unexecuted_blocks=1 00:06:23.578 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.578 ' 00:06:23.578 01:22:09 env -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:23.578 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.578 --rc genhtml_branch_coverage=1 00:06:23.578 --rc genhtml_function_coverage=1 00:06:23.578 --rc genhtml_legend=1 00:06:23.578 --rc geninfo_all_blocks=1 00:06:23.578 --rc geninfo_unexecuted_blocks=1 00:06:23.578 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.578 ' 00:06:23.578 01:22:09 env -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:23.578 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.578 --rc genhtml_branch_coverage=1 00:06:23.578 --rc genhtml_function_coverage=1 00:06:23.578 --rc genhtml_legend=1 00:06:23.578 --rc geninfo_all_blocks=1 00:06:23.578 --rc geninfo_unexecuted_blocks=1 00:06:23.578 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.578 ' 00:06:23.578 01:22:09 env -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:23.578 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:23.578 --rc genhtml_branch_coverage=1 00:06:23.578 --rc genhtml_function_coverage=1 00:06:23.578 --rc genhtml_legend=1 00:06:23.578 --rc geninfo_all_blocks=1 00:06:23.578 --rc geninfo_unexecuted_blocks=1 00:06:23.578 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:23.578 ' 00:06:23.578 01:22:09 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:23.578 01:22:09 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:23.578 01:22:09 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:23.578 01:22:09 env -- common/autotest_common.sh@10 -- # set +x 00:06:23.578 ************************************ 00:06:23.578 START TEST env_memory 00:06:23.578 ************************************ 00:06:23.578 01:22:09 env.env_memory -- 
common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:23.578 00:06:23.578 00:06:23.578 CUnit - A unit testing framework for C - Version 2.1-3 00:06:23.578 http://cunit.sourceforge.net/ 00:06:23.578 00:06:23.578 00:06:23.578 Suite: memory 00:06:23.578 Test: alloc and free memory map ...[2024-12-17 01:22:09.574509] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:23.837 passed 00:06:23.837 Test: mem map translation ...[2024-12-17 01:22:09.588444] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:23.837 [2024-12-17 01:22:09.588462] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:23.837 [2024-12-17 01:22:09.588493] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:23.837 [2024-12-17 01:22:09.588502] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:23.837 passed 00:06:23.837 Test: mem map registration ...[2024-12-17 01:22:09.608768] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:06:23.837 [2024-12-17 01:22:09.608782] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:06:23.837 passed 00:06:23.837 Test: mem map adjacent registrations ...passed 00:06:23.837 00:06:23.837 Run Summary: Type Total Ran Passed Failed Inactive 00:06:23.837 suites 1 1 n/a 0 0 00:06:23.837 tests 4 4 4 0 0 00:06:23.837 asserts 152 152 152 0 n/a 00:06:23.837 00:06:23.837 Elapsed time = 0.086 seconds 00:06:23.837 00:06:23.837 real 0m0.099s 00:06:23.837 user 0m0.086s 00:06:23.837 sys 0m0.013s 00:06:23.837 01:22:09 env.env_memory -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:23.837 01:22:09 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:23.837 ************************************ 00:06:23.837 END TEST env_memory 00:06:23.837 ************************************ 00:06:23.837 01:22:09 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:23.837 01:22:09 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:23.837 01:22:09 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:23.837 01:22:09 env -- common/autotest_common.sh@10 -- # set +x 00:06:23.837 ************************************ 00:06:23.837 START TEST env_vtophys 00:06:23.837 ************************************ 00:06:23.837 01:22:09 env.env_vtophys -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:23.837 EAL: lib.eal log level changed from notice to debug 00:06:23.837 EAL: Detected lcore 0 as core 0 on socket 0 00:06:23.837 EAL: Detected lcore 1 as core 1 on socket 0 00:06:23.837 EAL: Detected lcore 2 as core 2 on socket 0 00:06:23.837 EAL: Detected lcore 3 as 
core 3 on socket 0 00:06:23.837 EAL: Detected lcore 4 as core 4 on socket 0 00:06:23.837 EAL: Detected lcore 5 as core 5 on socket 0 00:06:23.837 EAL: Detected lcore 6 as core 6 on socket 0 00:06:23.837 EAL: Detected lcore 7 as core 8 on socket 0 00:06:23.837 EAL: Detected lcore 8 as core 9 on socket 0 00:06:23.837 EAL: Detected lcore 9 as core 10 on socket 0 00:06:23.837 EAL: Detected lcore 10 as core 11 on socket 0 00:06:23.837 EAL: Detected lcore 11 as core 12 on socket 0 00:06:23.837 EAL: Detected lcore 12 as core 13 on socket 0 00:06:23.837 EAL: Detected lcore 13 as core 14 on socket 0 00:06:23.837 EAL: Detected lcore 14 as core 16 on socket 0 00:06:23.837 EAL: Detected lcore 15 as core 17 on socket 0 00:06:23.837 EAL: Detected lcore 16 as core 18 on socket 0 00:06:23.837 EAL: Detected lcore 17 as core 19 on socket 0 00:06:23.837 EAL: Detected lcore 18 as core 20 on socket 0 00:06:23.837 EAL: Detected lcore 19 as core 21 on socket 0 00:06:23.837 EAL: Detected lcore 20 as core 22 on socket 0 00:06:23.837 EAL: Detected lcore 21 as core 24 on socket 0 00:06:23.837 EAL: Detected lcore 22 as core 25 on socket 0 00:06:23.837 EAL: Detected lcore 23 as core 26 on socket 0 00:06:23.837 EAL: Detected lcore 24 as core 27 on socket 0 00:06:23.837 EAL: Detected lcore 25 as core 28 on socket 0 00:06:23.837 EAL: Detected lcore 26 as core 29 on socket 0 00:06:23.837 EAL: Detected lcore 27 as core 30 on socket 0 00:06:23.837 EAL: Detected lcore 28 as core 0 on socket 1 00:06:23.837 EAL: Detected lcore 29 as core 1 on socket 1 00:06:23.837 EAL: Detected lcore 30 as core 2 on socket 1 00:06:23.837 EAL: Detected lcore 31 as core 3 on socket 1 00:06:23.837 EAL: Detected lcore 32 as core 4 on socket 1 00:06:23.837 EAL: Detected lcore 33 as core 5 on socket 1 00:06:23.837 EAL: Detected lcore 34 as core 6 on socket 1 00:06:23.837 EAL: Detected lcore 35 as core 8 on socket 1 00:06:23.837 EAL: Detected lcore 36 as core 9 on socket 1 00:06:23.837 EAL: Detected lcore 37 as core 10 on socket 1 00:06:23.837 EAL: Detected lcore 38 as core 11 on socket 1 00:06:23.837 EAL: Detected lcore 39 as core 12 on socket 1 00:06:23.837 EAL: Detected lcore 40 as core 13 on socket 1 00:06:23.837 EAL: Detected lcore 41 as core 14 on socket 1 00:06:23.837 EAL: Detected lcore 42 as core 16 on socket 1 00:06:23.837 EAL: Detected lcore 43 as core 17 on socket 1 00:06:23.837 EAL: Detected lcore 44 as core 18 on socket 1 00:06:23.837 EAL: Detected lcore 45 as core 19 on socket 1 00:06:23.837 EAL: Detected lcore 46 as core 20 on socket 1 00:06:23.837 EAL: Detected lcore 47 as core 21 on socket 1 00:06:23.837 EAL: Detected lcore 48 as core 22 on socket 1 00:06:23.837 EAL: Detected lcore 49 as core 24 on socket 1 00:06:23.837 EAL: Detected lcore 50 as core 25 on socket 1 00:06:23.837 EAL: Detected lcore 51 as core 26 on socket 1 00:06:23.837 EAL: Detected lcore 52 as core 27 on socket 1 00:06:23.837 EAL: Detected lcore 53 as core 28 on socket 1 00:06:23.837 EAL: Detected lcore 54 as core 29 on socket 1 00:06:23.837 EAL: Detected lcore 55 as core 30 on socket 1 00:06:23.837 EAL: Detected lcore 56 as core 0 on socket 0 00:06:23.837 EAL: Detected lcore 57 as core 1 on socket 0 00:06:23.837 EAL: Detected lcore 58 as core 2 on socket 0 00:06:23.837 EAL: Detected lcore 59 as core 3 on socket 0 00:06:23.837 EAL: Detected lcore 60 as core 4 on socket 0 00:06:23.837 EAL: Detected lcore 61 as core 5 on socket 0 00:06:23.837 EAL: Detected lcore 62 as core 6 on socket 0 00:06:23.837 EAL: Detected lcore 63 as core 8 on socket 0 00:06:23.837 EAL: 
Detected lcore 64 as core 9 on socket 0 00:06:23.837 EAL: Detected lcore 65 as core 10 on socket 0 00:06:23.837 EAL: Detected lcore 66 as core 11 on socket 0 00:06:23.837 EAL: Detected lcore 67 as core 12 on socket 0 00:06:23.837 EAL: Detected lcore 68 as core 13 on socket 0 00:06:23.837 EAL: Detected lcore 69 as core 14 on socket 0 00:06:23.837 EAL: Detected lcore 70 as core 16 on socket 0 00:06:23.837 EAL: Detected lcore 71 as core 17 on socket 0 00:06:23.837 EAL: Detected lcore 72 as core 18 on socket 0 00:06:23.837 EAL: Detected lcore 73 as core 19 on socket 0 00:06:23.837 EAL: Detected lcore 74 as core 20 on socket 0 00:06:23.837 EAL: Detected lcore 75 as core 21 on socket 0 00:06:23.837 EAL: Detected lcore 76 as core 22 on socket 0 00:06:23.837 EAL: Detected lcore 77 as core 24 on socket 0 00:06:23.837 EAL: Detected lcore 78 as core 25 on socket 0 00:06:23.837 EAL: Detected lcore 79 as core 26 on socket 0 00:06:23.837 EAL: Detected lcore 80 as core 27 on socket 0 00:06:23.837 EAL: Detected lcore 81 as core 28 on socket 0 00:06:23.837 EAL: Detected lcore 82 as core 29 on socket 0 00:06:23.837 EAL: Detected lcore 83 as core 30 on socket 0 00:06:23.837 EAL: Detected lcore 84 as core 0 on socket 1 00:06:23.837 EAL: Detected lcore 85 as core 1 on socket 1 00:06:23.837 EAL: Detected lcore 86 as core 2 on socket 1 00:06:23.837 EAL: Detected lcore 87 as core 3 on socket 1 00:06:23.837 EAL: Detected lcore 88 as core 4 on socket 1 00:06:23.837 EAL: Detected lcore 89 as core 5 on socket 1 00:06:23.837 EAL: Detected lcore 90 as core 6 on socket 1 00:06:23.837 EAL: Detected lcore 91 as core 8 on socket 1 00:06:23.837 EAL: Detected lcore 92 as core 9 on socket 1 00:06:23.838 EAL: Detected lcore 93 as core 10 on socket 1 00:06:23.838 EAL: Detected lcore 94 as core 11 on socket 1 00:06:23.838 EAL: Detected lcore 95 as core 12 on socket 1 00:06:23.838 EAL: Detected lcore 96 as core 13 on socket 1 00:06:23.838 EAL: Detected lcore 97 as core 14 on socket 1 00:06:23.838 EAL: Detected lcore 98 as core 16 on socket 1 00:06:23.838 EAL: Detected lcore 99 as core 17 on socket 1 00:06:23.838 EAL: Detected lcore 100 as core 18 on socket 1 00:06:23.838 EAL: Detected lcore 101 as core 19 on socket 1 00:06:23.838 EAL: Detected lcore 102 as core 20 on socket 1 00:06:23.838 EAL: Detected lcore 103 as core 21 on socket 1 00:06:23.838 EAL: Detected lcore 104 as core 22 on socket 1 00:06:23.838 EAL: Detected lcore 105 as core 24 on socket 1 00:06:23.838 EAL: Detected lcore 106 as core 25 on socket 1 00:06:23.838 EAL: Detected lcore 107 as core 26 on socket 1 00:06:23.838 EAL: Detected lcore 108 as core 27 on socket 1 00:06:23.838 EAL: Detected lcore 109 as core 28 on socket 1 00:06:23.838 EAL: Detected lcore 110 as core 29 on socket 1 00:06:23.838 EAL: Detected lcore 111 as core 30 on socket 1 00:06:23.838 EAL: Maximum logical cores by configuration: 128 00:06:23.838 EAL: Detected CPU lcores: 112 00:06:23.838 EAL: Detected NUMA nodes: 2 00:06:23.838 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:06:23.838 EAL: Checking presence of .so 'librte_eal.so.23' 00:06:23.838 EAL: Checking presence of .so 'librte_eal.so' 00:06:23.838 EAL: Detected static linkage of DPDK 00:06:23.838 EAL: No shared files mode enabled, IPC will be disabled 00:06:23.838 EAL: Bus pci wants IOVA as 'DC' 00:06:23.838 EAL: Buses did not request a specific IOVA mode. 00:06:23.838 EAL: IOMMU is available, selecting IOVA as VA mode. 00:06:23.838 EAL: Selected IOVA mode 'VA' 00:06:23.838 EAL: Probing VFIO support... 
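The lcore map above is EAL's view of the two-socket, 112-thread host, and "IOVA as VA" means DMA addresses will be the process's virtual addresses, with the IOMMU providing the translation. For reference only, a small standalone DPDK program (not part of this test suite; the file name topo_probe.c is made up here) can report the same topology and IOVA information after rte_eal_init():

    /* topo_probe.c - illustrative sketch; build against DPDK, e.g.
     *   gcc topo_probe.c $(pkg-config --cflags --libs libdpdk) -o topo_probe */
    #include <stdio.h>
    #include <rte_eal.h>
    #include <rte_lcore.h>

    int main(int argc, char **argv)
    {
        unsigned int lcore;

        if (rte_eal_init(argc, argv) < 0) {   /* parses EAL args such as -c / --base-virtaddr */
            fprintf(stderr, "rte_eal_init failed\n");
            return 1;
        }

        printf("lcores: %u, sockets: %u\n", rte_lcore_count(), rte_socket_count());
        printf("IOVA mode: %s\n",
               rte_eal_iova_mode() == RTE_IOVA_VA ? "VA" :
               rte_eal_iova_mode() == RTE_IOVA_PA ? "PA" : "DC");

        RTE_LCORE_FOREACH(lcore)              /* walk the lcores EAL detected */
            printf("lcore %u -> socket %u\n", lcore, rte_lcore_to_socket_id(lcore));

        rte_eal_cleanup();
        return 0;
    }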
00:06:23.838 EAL: IOMMU type 1 (Type 1) is supported 00:06:23.838 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:23.838 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:23.838 EAL: VFIO support initialized 00:06:23.838 EAL: Ask a virtual area of 0x2e000 bytes 00:06:23.838 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:23.838 EAL: Setting up physically contiguous memory... 00:06:23.838 EAL: Setting maximum number of open files to 524288 00:06:23.838 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:23.838 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:23.838 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:23.838 EAL: Ask a virtual area of 0x61000 bytes 00:06:23.838 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:23.838 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:23.838 EAL: Ask a virtual area of 0x400000000 bytes 00:06:23.838 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:23.838 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:23.838 EAL: Ask a virtual area of 0x61000 bytes 00:06:23.838 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:23.838 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:23.838 EAL: Ask a virtual area of 0x400000000 bytes 00:06:23.838 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:23.838 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:23.838 EAL: Ask a virtual area of 0x61000 bytes 00:06:23.838 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:23.838 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:23.838 EAL: Ask a virtual area of 0x400000000 bytes 00:06:23.838 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:23.838 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:23.838 EAL: Ask a virtual area of 0x61000 bytes 00:06:23.838 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:23.838 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:23.838 EAL: Ask a virtual area of 0x400000000 bytes 00:06:23.838 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:23.838 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:23.838 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:23.838 EAL: Ask a virtual area of 0x61000 bytes 00:06:23.838 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:23.838 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:23.838 EAL: Ask a virtual area of 0x400000000 bytes 00:06:23.838 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:23.838 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:23.838 EAL: Ask a virtual area of 0x61000 bytes 00:06:23.838 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:23.838 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:23.838 EAL: Ask a virtual area of 0x400000000 bytes 00:06:23.838 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:23.838 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:23.838 EAL: Ask a virtual area of 0x61000 bytes 00:06:23.838 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:23.838 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:23.838 EAL: Ask a virtual area of 0x400000000 bytes 00:06:23.838 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:06:23.838 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:23.838 EAL: Ask a virtual area of 0x61000 bytes 00:06:23.838 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:23.838 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:23.838 EAL: Ask a virtual area of 0x400000000 bytes 00:06:23.838 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:23.838 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:23.838 EAL: Hugepages will be freed exactly as allocated. 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: TSC frequency is ~2500000 KHz 00:06:23.838 EAL: Main lcore 0 is ready (tid=7f7dc08bda00;cpuset=[0]) 00:06:23.838 EAL: Trying to obtain current memory policy. 00:06:23.838 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:23.838 EAL: Restoring previous memory policy: 0 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was expanded by 2MB 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Mem event callback 'spdk:(nil)' registered 00:06:23.838 00:06:23.838 00:06:23.838 CUnit - A unit testing framework for C - Version 2.1-3 00:06:23.838 http://cunit.sourceforge.net/ 00:06:23.838 00:06:23.838 00:06:23.838 Suite: components_suite 00:06:23.838 Test: vtophys_malloc_test ...passed 00:06:23.838 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:23.838 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:23.838 EAL: Restoring previous memory policy: 4 00:06:23.838 EAL: Calling mem event callback 'spdk:(nil)' 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was expanded by 4MB 00:06:23.838 EAL: Calling mem event callback 'spdk:(nil)' 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was shrunk by 4MB 00:06:23.838 EAL: Trying to obtain current memory policy. 00:06:23.838 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:23.838 EAL: Restoring previous memory policy: 4 00:06:23.838 EAL: Calling mem event callback 'spdk:(nil)' 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was expanded by 6MB 00:06:23.838 EAL: Calling mem event callback 'spdk:(nil)' 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was shrunk by 6MB 00:06:23.838 EAL: Trying to obtain current memory policy. 00:06:23.838 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:23.838 EAL: Restoring previous memory policy: 4 00:06:23.838 EAL: Calling mem event callback 'spdk:(nil)' 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was expanded by 10MB 00:06:23.838 EAL: Calling mem event callback 'spdk:(nil)' 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was shrunk by 10MB 00:06:23.838 EAL: Trying to obtain current memory policy. 
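Each "Heap on socket 0 was expanded by ... / shrunk by ..." pair above and below corresponds to one allocate-and-free round of vtophys_spdk_malloc_test with a progressively larger pinned buffer. Stripped of the CUnit scaffolding, the core operation is roughly the following (a minimal sketch against the public spdk/env.h API, not the actual test source):

    /* vtophys_sketch.c - illustrative only; link against the SPDK env_dpdk libraries */
    #include <stdio.h>
    #include <inttypes.h>
    #include "spdk/env.h"

    int main(void)
    {
        struct spdk_env_opts opts;
        void *buf;
        uint64_t paddr;

        spdk_env_opts_init(&opts);
        opts.name = "vtophys_sketch";
        if (spdk_env_init(&opts) < 0) {
            fprintf(stderr, "spdk_env_init failed\n");
            return 1;
        }

        /* 2 MB, hugepage-backed, DMA-safe allocation */
        buf = spdk_dma_zmalloc(2 * 1024 * 1024, 0x200000, NULL);
        if (buf == NULL)
            return 1;

        /* translate the virtual address to a physical/IOVA address */
        paddr = spdk_vtophys(buf, NULL);
        if (paddr == SPDK_VTOPHYS_ERROR)
            fprintf(stderr, "translation failed\n");
        else
            printf("vaddr %p -> paddr 0x%" PRIx64 "\n", buf, paddr);

        spdk_dma_free(buf);
        return 0;
    }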
00:06:23.838 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:23.838 EAL: Restoring previous memory policy: 4 00:06:23.838 EAL: Calling mem event callback 'spdk:(nil)' 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was expanded by 18MB 00:06:23.838 EAL: Calling mem event callback 'spdk:(nil)' 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was shrunk by 18MB 00:06:23.838 EAL: Trying to obtain current memory policy. 00:06:23.838 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:23.838 EAL: Restoring previous memory policy: 4 00:06:23.838 EAL: Calling mem event callback 'spdk:(nil)' 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was expanded by 34MB 00:06:23.838 EAL: Calling mem event callback 'spdk:(nil)' 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was shrunk by 34MB 00:06:23.838 EAL: Trying to obtain current memory policy. 00:06:23.838 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:23.838 EAL: Restoring previous memory policy: 4 00:06:23.838 EAL: Calling mem event callback 'spdk:(nil)' 00:06:23.838 EAL: request: mp_malloc_sync 00:06:23.838 EAL: No shared files mode enabled, IPC is disabled 00:06:23.838 EAL: Heap on socket 0 was expanded by 66MB 00:06:24.097 EAL: Calling mem event callback 'spdk:(nil)' 00:06:24.097 EAL: request: mp_malloc_sync 00:06:24.097 EAL: No shared files mode enabled, IPC is disabled 00:06:24.097 EAL: Heap on socket 0 was shrunk by 66MB 00:06:24.097 EAL: Trying to obtain current memory policy. 00:06:24.097 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:24.097 EAL: Restoring previous memory policy: 4 00:06:24.097 EAL: Calling mem event callback 'spdk:(nil)' 00:06:24.097 EAL: request: mp_malloc_sync 00:06:24.097 EAL: No shared files mode enabled, IPC is disabled 00:06:24.097 EAL: Heap on socket 0 was expanded by 130MB 00:06:24.097 EAL: Calling mem event callback 'spdk:(nil)' 00:06:24.097 EAL: request: mp_malloc_sync 00:06:24.097 EAL: No shared files mode enabled, IPC is disabled 00:06:24.097 EAL: Heap on socket 0 was shrunk by 130MB 00:06:24.097 EAL: Trying to obtain current memory policy. 00:06:24.097 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:24.097 EAL: Restoring previous memory policy: 4 00:06:24.097 EAL: Calling mem event callback 'spdk:(nil)' 00:06:24.097 EAL: request: mp_malloc_sync 00:06:24.097 EAL: No shared files mode enabled, IPC is disabled 00:06:24.097 EAL: Heap on socket 0 was expanded by 258MB 00:06:24.097 EAL: Calling mem event callback 'spdk:(nil)' 00:06:24.097 EAL: request: mp_malloc_sync 00:06:24.097 EAL: No shared files mode enabled, IPC is disabled 00:06:24.097 EAL: Heap on socket 0 was shrunk by 258MB 00:06:24.097 EAL: Trying to obtain current memory policy. 
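The repeated "Calling mem event callback 'spdk:(nil)'" lines show EAL invoking the callback that was registered earlier ("Mem event callback 'spdk:(nil)' registered") each time the heap grows or shrinks; this is how SPDK keeps its own memory maps in sync with DPDK's dynamic allocations. The underlying hook is DPDK's memory-event API, roughly as sketched below (illustrative only, not SPDK's actual implementation):

    /* mem_event_sketch.c - illustrative only */
    #include <stdio.h>
    #include <rte_eal.h>
    #include <rte_memory.h>

    /* called by EAL whenever hugepage memory is allocated or freed */
    static void
    mem_event_cb(enum rte_mem_event event, const void *addr, size_t len, void *arg)
    {
        (void)arg;
        printf("%s: addr %p len %zu\n",
               event == RTE_MEM_EVENT_ALLOC ? "alloc" : "free", addr, len);
        /* SPDK's real callback (un)registers the region with its mem maps here */
    }

    int main(int argc, char **argv)
    {
        if (rte_eal_init(argc, argv) < 0)
            return 1;
        /* the name is what appears in the "Mem event callback '...' registered" log line */
        rte_mem_event_callback_register("sketch", mem_event_cb, NULL);
        /* ... allocating and freeing dynamic memory here would trigger the callback ... */
        rte_eal_cleanup();
        return 0;
    }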
00:06:24.097 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:24.355 EAL: Restoring previous memory policy: 4 00:06:24.355 EAL: Calling mem event callback 'spdk:(nil)' 00:06:24.355 EAL: request: mp_malloc_sync 00:06:24.355 EAL: No shared files mode enabled, IPC is disabled 00:06:24.355 EAL: Heap on socket 0 was expanded by 514MB 00:06:24.355 EAL: Calling mem event callback 'spdk:(nil)' 00:06:24.355 EAL: request: mp_malloc_sync 00:06:24.355 EAL: No shared files mode enabled, IPC is disabled 00:06:24.355 EAL: Heap on socket 0 was shrunk by 514MB 00:06:24.355 EAL: Trying to obtain current memory policy. 00:06:24.355 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:24.613 EAL: Restoring previous memory policy: 4 00:06:24.613 EAL: Calling mem event callback 'spdk:(nil)' 00:06:24.613 EAL: request: mp_malloc_sync 00:06:24.613 EAL: No shared files mode enabled, IPC is disabled 00:06:24.613 EAL: Heap on socket 0 was expanded by 1026MB 00:06:24.872 EAL: Calling mem event callback 'spdk:(nil)' 00:06:24.872 EAL: request: mp_malloc_sync 00:06:24.872 EAL: No shared files mode enabled, IPC is disabled 00:06:24.872 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:24.872 passed 00:06:24.872 00:06:24.872 Run Summary: Type Total Ran Passed Failed Inactive 00:06:24.872 suites 1 1 n/a 0 0 00:06:24.872 tests 2 2 2 0 0 00:06:24.872 asserts 497 497 497 0 n/a 00:06:24.872 00:06:24.872 Elapsed time = 0.966 seconds 00:06:24.872 EAL: Calling mem event callback 'spdk:(nil)' 00:06:24.872 EAL: request: mp_malloc_sync 00:06:24.872 EAL: No shared files mode enabled, IPC is disabled 00:06:24.872 EAL: Heap on socket 0 was shrunk by 2MB 00:06:24.872 EAL: No shared files mode enabled, IPC is disabled 00:06:24.872 EAL: No shared files mode enabled, IPC is disabled 00:06:24.872 EAL: No shared files mode enabled, IPC is disabled 00:06:24.872 00:06:24.872 real 0m1.087s 00:06:24.872 user 0m0.622s 00:06:24.872 sys 0m0.437s 00:06:24.872 01:22:10 env.env_vtophys -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:24.872 01:22:10 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:24.872 ************************************ 00:06:24.872 END TEST env_vtophys 00:06:24.872 ************************************ 00:06:24.872 01:22:10 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:24.872 01:22:10 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:24.872 01:22:10 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:24.872 01:22:10 env -- common/autotest_common.sh@10 -- # set +x 00:06:25.130 ************************************ 00:06:25.130 START TEST env_pci 00:06:25.130 ************************************ 00:06:25.130 01:22:10 env.env_pci -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:25.130 00:06:25.130 00:06:25.130 CUnit - A unit testing framework for C - Version 2.1-3 00:06:25.130 http://cunit.sourceforge.net/ 00:06:25.130 00:06:25.130 00:06:25.130 Suite: pci 00:06:25.130 Test: pci_hook ...[2024-12-17 01:22:10.912980] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1050:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 808715 has claimed it 00:06:25.130 EAL: Cannot find device (10000:00:01.0) 00:06:25.130 EAL: Failed to attach device on primary process 00:06:25.130 passed 00:06:25.130 00:06:25.130 Run Summary: Type Total Ran Passed Failed Inactive 
00:06:25.130 suites 1 1 n/a 0 0 00:06:25.130 tests 1 1 1 0 0 00:06:25.130 asserts 25 25 25 0 n/a 00:06:25.130 00:06:25.130 Elapsed time = 0.035 seconds 00:06:25.130 00:06:25.130 real 0m0.054s 00:06:25.130 user 0m0.013s 00:06:25.130 sys 0m0.041s 00:06:25.130 01:22:10 env.env_pci -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:25.130 01:22:10 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:25.130 ************************************ 00:06:25.130 END TEST env_pci 00:06:25.130 ************************************ 00:06:25.130 01:22:10 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:25.130 01:22:10 env -- env/env.sh@15 -- # uname 00:06:25.130 01:22:10 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:25.130 01:22:10 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:25.130 01:22:10 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:25.130 01:22:11 env -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:06:25.130 01:22:11 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:25.130 01:22:11 env -- common/autotest_common.sh@10 -- # set +x 00:06:25.130 ************************************ 00:06:25.130 START TEST env_dpdk_post_init 00:06:25.130 ************************************ 00:06:25.130 01:22:11 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:25.130 EAL: Detected CPU lcores: 112 00:06:25.130 EAL: Detected NUMA nodes: 2 00:06:25.130 EAL: Detected static linkage of DPDK 00:06:25.130 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:25.130 EAL: Selected IOVA mode 'VA' 00:06:25.130 EAL: VFIO support initialized 00:06:25.130 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:25.389 EAL: Using IOMMU type 1 (Type 1) 00:06:25.968 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:06:30.219 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:06:30.219 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:06:30.219 Starting DPDK initialization... 00:06:30.219 Starting SPDK post initialization... 00:06:30.219 SPDK NVMe probe 00:06:30.219 Attaching to 0000:d8:00.0 00:06:30.219 Attached to 0000:d8:00.0 00:06:30.219 Cleaning up... 
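env_dpdk_post_init above brings the SPDK environment up with core mask 0x1 and --base-virtaddr=0x200000000000, then probes the NVMe controller at 0000:d8:00.0 through the spdk_nvme PCI driver and detaches again ("Cleaning up..."). A bare-bones version of that probe loop looks roughly like this (a hedged sketch against spdk/env.h and spdk/nvme.h; not the test's source, and the program name is made up):

    /* nvme_probe_sketch.c - illustrative only */
    #include <stdio.h>
    #include "spdk/env.h"
    #include "spdk/nvme.h"

    static bool
    probe_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
             struct spdk_nvme_ctrlr_opts *opts)
    {
        (void)ctx; (void)opts;
        printf("Attaching to %s\n", trid->traddr);
        return true;                       /* true = attach to this controller */
    }

    static void
    attach_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
              struct spdk_nvme_ctrlr *ctrlr, const struct spdk_nvme_ctrlr_opts *opts)
    {
        (void)ctx; (void)opts;
        printf("Attached to %s\n", trid->traddr);
        spdk_nvme_detach(ctrlr);           /* release the controller again */
    }

    int main(void)
    {
        struct spdk_env_opts opts;

        spdk_env_opts_init(&opts);
        opts.name = "nvme_probe_sketch";
        opts.core_mask = "0x1";
        opts.base_virtaddr = 0x200000000000;
        if (spdk_env_init(&opts) < 0)
            return 1;

        /* scan the local PCIe bus for NVMe controllers */
        if (spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL) != 0)
            return 1;
        return 0;
    }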
00:06:30.219 00:06:30.219 real 0m4.762s 00:06:30.219 user 0m3.577s 00:06:30.219 sys 0m0.431s 00:06:30.219 01:22:15 env.env_dpdk_post_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:30.219 01:22:15 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:30.219 ************************************ 00:06:30.219 END TEST env_dpdk_post_init 00:06:30.219 ************************************ 00:06:30.219 01:22:15 env -- env/env.sh@26 -- # uname 00:06:30.219 01:22:15 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:30.219 01:22:15 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:30.219 01:22:15 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:30.219 01:22:15 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:30.219 01:22:15 env -- common/autotest_common.sh@10 -- # set +x 00:06:30.219 ************************************ 00:06:30.219 START TEST env_mem_callbacks 00:06:30.219 ************************************ 00:06:30.219 01:22:15 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:30.219 EAL: Detected CPU lcores: 112 00:06:30.219 EAL: Detected NUMA nodes: 2 00:06:30.219 EAL: Detected static linkage of DPDK 00:06:30.219 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:30.219 EAL: Selected IOVA mode 'VA' 00:06:30.219 EAL: VFIO support initialized 00:06:30.219 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:30.219 00:06:30.219 00:06:30.219 CUnit - A unit testing framework for C - Version 2.1-3 00:06:30.219 http://cunit.sourceforge.net/ 00:06:30.219 00:06:30.219 00:06:30.219 Suite: memory 00:06:30.219 Test: test ... 
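The register/unregister lines in the trace that follows are printed by the mem_callbacks test's memory-notification callback as the DPDK heap grows and shrinks underneath its allocations. A closely related pattern an application can use directly is spdk_mem_register()/spdk_mem_unregister() on memory it allocated itself; a minimal sketch of that pattern (assuming the standard spdk/env.h API and an arbitrarily chosen region size; not this test's source) is:

    /* mem_register_sketch.c - illustrative only */
    #include <stdio.h>
    #include <stdlib.h>
    #include "spdk/env.h"

    int main(void)
    {
        struct spdk_env_opts opts;
        size_t len = 2 * 1024 * 1024;      /* arbitrary 2 MB region */
        void *buf;

        spdk_env_opts_init(&opts);
        opts.name = "mem_register_sketch";
        if (spdk_env_init(&opts) < 0)
            return 1;

        /* spdk_mem_register expects 2 MB-aligned addresses and lengths */
        if (posix_memalign(&buf, 0x200000, len) != 0)
            return 1;

        if (spdk_mem_register(buf, len) != 0)        /* make the region visible to SPDK's mem maps */
            fprintf(stderr, "register failed\n");
        /* ... the region could now be used for DMA-capable I/O ... */
        if (spdk_mem_unregister(buf, len) != 0)
            fprintf(stderr, "unregister failed\n");

        free(buf);
        return 0;
    }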
00:06:30.219 register 0x200000200000 2097152 00:06:30.219 malloc 3145728 00:06:30.219 register 0x200000400000 4194304 00:06:30.219 buf 0x200000500000 len 3145728 PASSED 00:06:30.219 malloc 64 00:06:30.219 buf 0x2000004fff40 len 64 PASSED 00:06:30.219 malloc 4194304 00:06:30.219 register 0x200000800000 6291456 00:06:30.219 buf 0x200000a00000 len 4194304 PASSED 00:06:30.219 free 0x200000500000 3145728 00:06:30.219 free 0x2000004fff40 64 00:06:30.219 unregister 0x200000400000 4194304 PASSED 00:06:30.219 free 0x200000a00000 4194304 00:06:30.219 unregister 0x200000800000 6291456 PASSED 00:06:30.219 malloc 8388608 00:06:30.219 register 0x200000400000 10485760 00:06:30.219 buf 0x200000600000 len 8388608 PASSED 00:06:30.219 free 0x200000600000 8388608 00:06:30.219 unregister 0x200000400000 10485760 PASSED 00:06:30.219 passed 00:06:30.219 00:06:30.219 Run Summary: Type Total Ran Passed Failed Inactive 00:06:30.219 suites 1 1 n/a 0 0 00:06:30.219 tests 1 1 1 0 0 00:06:30.219 asserts 15 15 15 0 n/a 00:06:30.219 00:06:30.219 Elapsed time = 0.005 seconds 00:06:30.219 00:06:30.219 real 0m0.062s 00:06:30.219 user 0m0.014s 00:06:30.219 sys 0m0.047s 00:06:30.219 01:22:15 env.env_mem_callbacks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:30.219 01:22:15 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:30.219 ************************************ 00:06:30.219 END TEST env_mem_callbacks 00:06:30.219 ************************************ 00:06:30.219 00:06:30.219 real 0m6.692s 00:06:30.219 user 0m4.604s 00:06:30.219 sys 0m1.347s 00:06:30.219 01:22:15 env -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:30.219 01:22:15 env -- common/autotest_common.sh@10 -- # set +x 00:06:30.219 ************************************ 00:06:30.219 END TEST env 00:06:30.219 ************************************ 00:06:30.219 01:22:16 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:30.219 01:22:16 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:30.219 01:22:16 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:30.219 01:22:16 -- common/autotest_common.sh@10 -- # set +x 00:06:30.219 ************************************ 00:06:30.219 START TEST rpc 00:06:30.219 ************************************ 00:06:30.219 01:22:16 rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:30.219 * Looking for test storage... 
00:06:30.219 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:30.219 01:22:16 rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:30.219 01:22:16 rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:06:30.219 01:22:16 rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:30.479 01:22:16 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:30.479 01:22:16 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:30.479 01:22:16 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:30.479 01:22:16 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:30.479 01:22:16 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:30.479 01:22:16 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:30.479 01:22:16 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:30.479 01:22:16 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:30.479 01:22:16 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:30.479 01:22:16 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:30.479 01:22:16 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:30.479 01:22:16 rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:30.479 01:22:16 rpc -- scripts/common.sh@345 -- # : 1 00:06:30.479 01:22:16 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:30.479 01:22:16 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:30.479 01:22:16 rpc -- scripts/common.sh@365 -- # decimal 1 00:06:30.479 01:22:16 rpc -- scripts/common.sh@353 -- # local d=1 00:06:30.479 01:22:16 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:30.479 01:22:16 rpc -- scripts/common.sh@355 -- # echo 1 00:06:30.479 01:22:16 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:30.479 01:22:16 rpc -- scripts/common.sh@366 -- # decimal 2 00:06:30.479 01:22:16 rpc -- scripts/common.sh@353 -- # local d=2 00:06:30.479 01:22:16 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:30.479 01:22:16 rpc -- scripts/common.sh@355 -- # echo 2 00:06:30.479 01:22:16 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:30.479 01:22:16 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:30.479 01:22:16 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:30.479 01:22:16 rpc -- scripts/common.sh@368 -- # return 0 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:30.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.479 --rc genhtml_branch_coverage=1 00:06:30.479 --rc genhtml_function_coverage=1 00:06:30.479 --rc genhtml_legend=1 00:06:30.479 --rc geninfo_all_blocks=1 00:06:30.479 --rc geninfo_unexecuted_blocks=1 00:06:30.479 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.479 ' 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:30.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.479 --rc genhtml_branch_coverage=1 00:06:30.479 --rc genhtml_function_coverage=1 00:06:30.479 --rc genhtml_legend=1 00:06:30.479 --rc geninfo_all_blocks=1 00:06:30.479 --rc geninfo_unexecuted_blocks=1 00:06:30.479 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.479 ' 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@1695 -- # 
export 'LCOV=lcov 00:06:30.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.479 --rc genhtml_branch_coverage=1 00:06:30.479 --rc genhtml_function_coverage=1 00:06:30.479 --rc genhtml_legend=1 00:06:30.479 --rc geninfo_all_blocks=1 00:06:30.479 --rc geninfo_unexecuted_blocks=1 00:06:30.479 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.479 ' 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:30.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.479 --rc genhtml_branch_coverage=1 00:06:30.479 --rc genhtml_function_coverage=1 00:06:30.479 --rc genhtml_legend=1 00:06:30.479 --rc geninfo_all_blocks=1 00:06:30.479 --rc geninfo_unexecuted_blocks=1 00:06:30.479 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.479 ' 00:06:30.479 01:22:16 rpc -- rpc/rpc.sh@65 -- # spdk_pid=809884 00:06:30.479 01:22:16 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:30.479 01:22:16 rpc -- rpc/rpc.sh@67 -- # waitforlisten 809884 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@831 -- # '[' -z 809884 ']' 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:30.479 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:30.479 01:22:16 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:30.479 01:22:16 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:30.479 [2024-12-17 01:22:16.289859] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:30.479 [2024-12-17 01:22:16.289942] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid809884 ] 00:06:30.479 [2024-12-17 01:22:16.356510] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.479 [2024-12-17 01:22:16.396005] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:30.479 [2024-12-17 01:22:16.396042] app.c: 614:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 809884' to capture a snapshot of events at runtime. 00:06:30.479 [2024-12-17 01:22:16.396052] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:30.479 [2024-12-17 01:22:16.396060] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:30.479 [2024-12-17 01:22:16.396066] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid809884 for offline analysis/debug. 
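waitforlisten above simply polls until the freshly started spdk_tgt (pid 809884) is accepting connections on /var/tmp/spdk.sock before any RPCs are issued. The same readiness check can be expressed directly as a plain AF_UNIX connect loop (a generic POSIX sketch, unrelated to the test scripts themselves):

    /* wait_for_sock.c - illustrative only */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <sys/un.h>

    /* retry connect() until the server is listening, or give up */
    static int wait_for_listen(const char *path, int attempts)
    {
        struct sockaddr_un addr;
        int fd, i;

        memset(&addr, 0, sizeof(addr));
        addr.sun_family = AF_UNIX;
        strncpy(addr.sun_path, path, sizeof(addr.sun_path) - 1);

        for (i = 0; i < attempts; i++) {
            fd = socket(AF_UNIX, SOCK_STREAM, 0);
            if (fd < 0)
                return -1;
            if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) == 0) {
                close(fd);
                return 0;               /* target is up and listening */
            }
            close(fd);
            usleep(100 * 1000);         /* 100 ms between attempts */
        }
        return -1;
    }

    int main(void)
    {
        if (wait_for_listen("/var/tmp/spdk.sock", 100) == 0)
            printf("spdk_tgt is listening\n");
        else
            printf("timed out waiting for spdk_tgt\n");
        return 0;
    }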
00:06:30.479 [2024-12-17 01:22:16.396098] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.738 01:22:16 rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:30.738 01:22:16 rpc -- common/autotest_common.sh@864 -- # return 0 00:06:30.738 01:22:16 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:30.738 01:22:16 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:30.738 01:22:16 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:30.739 01:22:16 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:30.739 01:22:16 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:30.739 01:22:16 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:30.739 01:22:16 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:30.739 ************************************ 00:06:30.739 START TEST rpc_integrity 00:06:30.739 ************************************ 00:06:30.739 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:30.739 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:30.739 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.739 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:30.739 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.739 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:30.739 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:30.739 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:30.739 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:30.739 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.739 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:30.739 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.739 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:30.739 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:30.739 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.739 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:30.739 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.739 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:30.739 { 00:06:30.739 "name": "Malloc0", 00:06:30.739 "aliases": [ 00:06:30.739 "b01f3429-5c2e-4843-8e95-fb21936d8796" 00:06:30.739 ], 00:06:30.739 "product_name": "Malloc disk", 00:06:30.739 "block_size": 512, 00:06:30.739 "num_blocks": 16384, 00:06:30.739 "uuid": "b01f3429-5c2e-4843-8e95-fb21936d8796", 00:06:30.739 "assigned_rate_limits": { 00:06:30.739 "rw_ios_per_sec": 0, 00:06:30.739 "rw_mbytes_per_sec": 0, 00:06:30.739 "r_mbytes_per_sec": 0, 00:06:30.739 "w_mbytes_per_sec": 
0 00:06:30.739 }, 00:06:30.739 "claimed": false, 00:06:30.739 "zoned": false, 00:06:30.739 "supported_io_types": { 00:06:30.739 "read": true, 00:06:30.739 "write": true, 00:06:30.739 "unmap": true, 00:06:30.739 "flush": true, 00:06:30.739 "reset": true, 00:06:30.739 "nvme_admin": false, 00:06:30.739 "nvme_io": false, 00:06:30.739 "nvme_io_md": false, 00:06:30.739 "write_zeroes": true, 00:06:30.739 "zcopy": true, 00:06:30.739 "get_zone_info": false, 00:06:30.739 "zone_management": false, 00:06:30.739 "zone_append": false, 00:06:30.739 "compare": false, 00:06:30.739 "compare_and_write": false, 00:06:30.739 "abort": true, 00:06:30.739 "seek_hole": false, 00:06:30.739 "seek_data": false, 00:06:30.739 "copy": true, 00:06:30.739 "nvme_iov_md": false 00:06:30.739 }, 00:06:30.739 "memory_domains": [ 00:06:30.739 { 00:06:30.739 "dma_device_id": "system", 00:06:30.739 "dma_device_type": 1 00:06:30.739 }, 00:06:30.739 { 00:06:30.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:30.739 "dma_device_type": 2 00:06:30.739 } 00:06:30.739 ], 00:06:30.739 "driver_specific": {} 00:06:30.739 } 00:06:30.739 ]' 00:06:30.739 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:30.997 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:30.997 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:30.998 [2024-12-17 01:22:16.747230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:30.998 [2024-12-17 01:22:16.747262] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:30.998 [2024-12-17 01:22:16.747278] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x58b1bc0 00:06:30.998 [2024-12-17 01:22:16.747287] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:30.998 [2024-12-17 01:22:16.748084] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:30.998 [2024-12-17 01:22:16.748106] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:30.998 Passthru0 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.998 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.998 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:30.998 { 00:06:30.998 "name": "Malloc0", 00:06:30.998 "aliases": [ 00:06:30.998 "b01f3429-5c2e-4843-8e95-fb21936d8796" 00:06:30.998 ], 00:06:30.998 "product_name": "Malloc disk", 00:06:30.998 "block_size": 512, 00:06:30.998 "num_blocks": 16384, 00:06:30.998 "uuid": "b01f3429-5c2e-4843-8e95-fb21936d8796", 00:06:30.998 "assigned_rate_limits": { 00:06:30.998 "rw_ios_per_sec": 0, 00:06:30.998 "rw_mbytes_per_sec": 0, 00:06:30.998 "r_mbytes_per_sec": 0, 00:06:30.998 "w_mbytes_per_sec": 0 00:06:30.998 }, 00:06:30.998 "claimed": true, 00:06:30.998 "claim_type": "exclusive_write", 00:06:30.998 "zoned": false, 00:06:30.998 "supported_io_types": { 00:06:30.998 "read": true, 00:06:30.998 "write": true, 00:06:30.998 "unmap": true, 
00:06:30.998 "flush": true, 00:06:30.998 "reset": true, 00:06:30.998 "nvme_admin": false, 00:06:30.998 "nvme_io": false, 00:06:30.998 "nvme_io_md": false, 00:06:30.998 "write_zeroes": true, 00:06:30.998 "zcopy": true, 00:06:30.998 "get_zone_info": false, 00:06:30.998 "zone_management": false, 00:06:30.998 "zone_append": false, 00:06:30.998 "compare": false, 00:06:30.998 "compare_and_write": false, 00:06:30.998 "abort": true, 00:06:30.998 "seek_hole": false, 00:06:30.998 "seek_data": false, 00:06:30.998 "copy": true, 00:06:30.998 "nvme_iov_md": false 00:06:30.998 }, 00:06:30.998 "memory_domains": [ 00:06:30.998 { 00:06:30.998 "dma_device_id": "system", 00:06:30.998 "dma_device_type": 1 00:06:30.998 }, 00:06:30.998 { 00:06:30.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:30.998 "dma_device_type": 2 00:06:30.998 } 00:06:30.998 ], 00:06:30.998 "driver_specific": {} 00:06:30.998 }, 00:06:30.998 { 00:06:30.998 "name": "Passthru0", 00:06:30.998 "aliases": [ 00:06:30.998 "d21661e1-79df-5ff6-bd57-c55bc44aef44" 00:06:30.998 ], 00:06:30.998 "product_name": "passthru", 00:06:30.998 "block_size": 512, 00:06:30.998 "num_blocks": 16384, 00:06:30.998 "uuid": "d21661e1-79df-5ff6-bd57-c55bc44aef44", 00:06:30.998 "assigned_rate_limits": { 00:06:30.998 "rw_ios_per_sec": 0, 00:06:30.998 "rw_mbytes_per_sec": 0, 00:06:30.998 "r_mbytes_per_sec": 0, 00:06:30.998 "w_mbytes_per_sec": 0 00:06:30.998 }, 00:06:30.998 "claimed": false, 00:06:30.998 "zoned": false, 00:06:30.998 "supported_io_types": { 00:06:30.998 "read": true, 00:06:30.998 "write": true, 00:06:30.998 "unmap": true, 00:06:30.998 "flush": true, 00:06:30.998 "reset": true, 00:06:30.998 "nvme_admin": false, 00:06:30.998 "nvme_io": false, 00:06:30.998 "nvme_io_md": false, 00:06:30.998 "write_zeroes": true, 00:06:30.998 "zcopy": true, 00:06:30.998 "get_zone_info": false, 00:06:30.998 "zone_management": false, 00:06:30.998 "zone_append": false, 00:06:30.998 "compare": false, 00:06:30.998 "compare_and_write": false, 00:06:30.998 "abort": true, 00:06:30.998 "seek_hole": false, 00:06:30.998 "seek_data": false, 00:06:30.998 "copy": true, 00:06:30.998 "nvme_iov_md": false 00:06:30.998 }, 00:06:30.998 "memory_domains": [ 00:06:30.998 { 00:06:30.998 "dma_device_id": "system", 00:06:30.998 "dma_device_type": 1 00:06:30.998 }, 00:06:30.998 { 00:06:30.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:30.998 "dma_device_type": 2 00:06:30.998 } 00:06:30.998 ], 00:06:30.998 "driver_specific": { 00:06:30.998 "passthru": { 00:06:30.998 "name": "Passthru0", 00:06:30.998 "base_bdev_name": "Malloc0" 00:06:30.998 } 00:06:30.998 } 00:06:30.998 } 00:06:30.998 ]' 00:06:30.998 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:30.998 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:30.998 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.998 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.998 01:22:16 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.998 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:30.998 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:30.998 01:22:16 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:30.998 00:06:30.998 real 0m0.264s 00:06:30.998 user 0m0.159s 00:06:30.998 sys 0m0.038s 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:30.998 01:22:16 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:30.998 ************************************ 00:06:30.998 END TEST rpc_integrity 00:06:30.998 ************************************ 00:06:30.998 01:22:16 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:30.998 01:22:16 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:30.998 01:22:16 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:30.998 01:22:16 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:30.998 ************************************ 00:06:30.998 START TEST rpc_plugins 00:06:30.998 ************************************ 00:06:30.998 01:22:16 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # rpc_plugins 00:06:30.998 01:22:16 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:30.998 01:22:16 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.998 01:22:16 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:30.998 01:22:16 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.998 01:22:16 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:30.998 01:22:16 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:30.998 01:22:16 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.998 01:22:16 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:30.998 01:22:16 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.998 01:22:16 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:30.998 { 00:06:30.998 "name": "Malloc1", 00:06:30.998 "aliases": [ 00:06:30.998 "1d7aacc8-6bec-4bed-965b-3206185165ab" 00:06:30.998 ], 00:06:30.998 "product_name": "Malloc disk", 00:06:30.998 "block_size": 4096, 00:06:30.998 "num_blocks": 256, 00:06:30.998 "uuid": "1d7aacc8-6bec-4bed-965b-3206185165ab", 00:06:30.998 "assigned_rate_limits": { 00:06:30.998 "rw_ios_per_sec": 0, 00:06:30.998 "rw_mbytes_per_sec": 0, 00:06:30.998 "r_mbytes_per_sec": 0, 00:06:30.998 "w_mbytes_per_sec": 0 00:06:30.998 }, 00:06:30.998 "claimed": false, 00:06:30.998 "zoned": false, 00:06:30.998 "supported_io_types": { 00:06:30.998 "read": true, 00:06:30.998 "write": true, 00:06:30.998 "unmap": true, 00:06:30.998 "flush": true, 00:06:30.998 "reset": true, 00:06:30.998 "nvme_admin": false, 00:06:30.998 "nvme_io": false, 00:06:30.998 "nvme_io_md": false, 00:06:30.998 "write_zeroes": true, 00:06:30.998 "zcopy": true, 00:06:30.998 "get_zone_info": false, 00:06:30.998 "zone_management": false, 00:06:30.998 "zone_append": false, 00:06:30.998 "compare": false, 00:06:30.998 "compare_and_write": false, 00:06:30.998 "abort": true, 00:06:30.998 "seek_hole": false, 00:06:30.998 "seek_data": false, 00:06:30.998 "copy": true, 00:06:30.998 
"nvme_iov_md": false 00:06:30.998 }, 00:06:30.998 "memory_domains": [ 00:06:30.998 { 00:06:30.998 "dma_device_id": "system", 00:06:30.998 "dma_device_type": 1 00:06:30.998 }, 00:06:30.998 { 00:06:30.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:30.998 "dma_device_type": 2 00:06:30.998 } 00:06:30.998 ], 00:06:30.998 "driver_specific": {} 00:06:30.998 } 00:06:30.998 ]' 00:06:30.999 01:22:16 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:31.257 01:22:17 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:31.257 01:22:17 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:31.257 01:22:17 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.257 01:22:17 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:31.257 01:22:17 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.257 01:22:17 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:31.257 01:22:17 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.257 01:22:17 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:31.257 01:22:17 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.257 01:22:17 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:31.257 01:22:17 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:31.257 01:22:17 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:31.257 00:06:31.257 real 0m0.134s 00:06:31.257 user 0m0.085s 00:06:31.257 sys 0m0.017s 00:06:31.257 01:22:17 rpc.rpc_plugins -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:31.257 01:22:17 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:31.257 ************************************ 00:06:31.257 END TEST rpc_plugins 00:06:31.257 ************************************ 00:06:31.257 01:22:17 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:31.257 01:22:17 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:31.257 01:22:17 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:31.257 01:22:17 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:31.257 ************************************ 00:06:31.257 START TEST rpc_trace_cmd_test 00:06:31.257 ************************************ 00:06:31.258 01:22:17 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # rpc_trace_cmd_test 00:06:31.258 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:31.258 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:31.258 01:22:17 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.258 01:22:17 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:31.258 01:22:17 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.258 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:31.258 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid809884", 00:06:31.258 "tpoint_group_mask": "0x8", 00:06:31.258 "iscsi_conn": { 00:06:31.258 "mask": "0x2", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "scsi": { 00:06:31.258 "mask": "0x4", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "bdev": { 00:06:31.258 "mask": "0x8", 00:06:31.258 "tpoint_mask": "0xffffffffffffffff" 00:06:31.258 }, 00:06:31.258 "nvmf_rdma": { 00:06:31.258 "mask": "0x10", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "nvmf_tcp": { 00:06:31.258 "mask": "0x20", 
00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "ftl": { 00:06:31.258 "mask": "0x40", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "blobfs": { 00:06:31.258 "mask": "0x80", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "dsa": { 00:06:31.258 "mask": "0x200", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "thread": { 00:06:31.258 "mask": "0x400", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "nvme_pcie": { 00:06:31.258 "mask": "0x800", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "iaa": { 00:06:31.258 "mask": "0x1000", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "nvme_tcp": { 00:06:31.258 "mask": "0x2000", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "bdev_nvme": { 00:06:31.258 "mask": "0x4000", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "sock": { 00:06:31.258 "mask": "0x8000", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "blob": { 00:06:31.258 "mask": "0x10000", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 }, 00:06:31.258 "bdev_raid": { 00:06:31.258 "mask": "0x20000", 00:06:31.258 "tpoint_mask": "0x0" 00:06:31.258 } 00:06:31.258 }' 00:06:31.258 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:31.258 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 18 -gt 2 ']' 00:06:31.258 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:31.516 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:31.516 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:31.516 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:31.516 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:31.516 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:31.516 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:31.516 01:22:17 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:31.516 00:06:31.517 real 0m0.237s 00:06:31.517 user 0m0.203s 00:06:31.517 sys 0m0.023s 00:06:31.517 01:22:17 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:31.517 01:22:17 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:31.517 ************************************ 00:06:31.517 END TEST rpc_trace_cmd_test 00:06:31.517 ************************************ 00:06:31.517 01:22:17 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:31.517 01:22:17 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:31.517 01:22:17 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:31.517 01:22:17 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:31.517 01:22:17 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:31.517 01:22:17 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:31.517 ************************************ 00:06:31.517 START TEST rpc_daemon_integrity 00:06:31.517 ************************************ 00:06:31.517 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:31.517 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:31.517 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.517 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:31.517 01:22:17 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.517 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:31.517 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:31.775 { 00:06:31.775 "name": "Malloc2", 00:06:31.775 "aliases": [ 00:06:31.775 "adb83aac-127c-4aae-a299-380b6aab6114" 00:06:31.775 ], 00:06:31.775 "product_name": "Malloc disk", 00:06:31.775 "block_size": 512, 00:06:31.775 "num_blocks": 16384, 00:06:31.775 "uuid": "adb83aac-127c-4aae-a299-380b6aab6114", 00:06:31.775 "assigned_rate_limits": { 00:06:31.775 "rw_ios_per_sec": 0, 00:06:31.775 "rw_mbytes_per_sec": 0, 00:06:31.775 "r_mbytes_per_sec": 0, 00:06:31.775 "w_mbytes_per_sec": 0 00:06:31.775 }, 00:06:31.775 "claimed": false, 00:06:31.775 "zoned": false, 00:06:31.775 "supported_io_types": { 00:06:31.775 "read": true, 00:06:31.775 "write": true, 00:06:31.775 "unmap": true, 00:06:31.775 "flush": true, 00:06:31.775 "reset": true, 00:06:31.775 "nvme_admin": false, 00:06:31.775 "nvme_io": false, 00:06:31.775 "nvme_io_md": false, 00:06:31.775 "write_zeroes": true, 00:06:31.775 "zcopy": true, 00:06:31.775 "get_zone_info": false, 00:06:31.775 "zone_management": false, 00:06:31.775 "zone_append": false, 00:06:31.775 "compare": false, 00:06:31.775 "compare_and_write": false, 00:06:31.775 "abort": true, 00:06:31.775 "seek_hole": false, 00:06:31.775 "seek_data": false, 00:06:31.775 "copy": true, 00:06:31.775 "nvme_iov_md": false 00:06:31.775 }, 00:06:31.775 "memory_domains": [ 00:06:31.775 { 00:06:31.775 "dma_device_id": "system", 00:06:31.775 "dma_device_type": 1 00:06:31.775 }, 00:06:31.775 { 00:06:31.775 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:31.775 "dma_device_type": 2 00:06:31.775 } 00:06:31.775 ], 00:06:31.775 "driver_specific": {} 00:06:31.775 } 00:06:31.775 ]' 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:31.775 [2024-12-17 01:22:17.629493] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:31.775 [2024-12-17 01:22:17.629524] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:06:31.775 [2024-12-17 01:22:17.629540] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x58b61d0 00:06:31.775 [2024-12-17 01:22:17.629549] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:31.775 [2024-12-17 01:22:17.630274] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:31.775 [2024-12-17 01:22:17.630294] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:31.775 Passthru0 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.775 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:31.775 { 00:06:31.775 "name": "Malloc2", 00:06:31.775 "aliases": [ 00:06:31.775 "adb83aac-127c-4aae-a299-380b6aab6114" 00:06:31.775 ], 00:06:31.775 "product_name": "Malloc disk", 00:06:31.775 "block_size": 512, 00:06:31.775 "num_blocks": 16384, 00:06:31.775 "uuid": "adb83aac-127c-4aae-a299-380b6aab6114", 00:06:31.775 "assigned_rate_limits": { 00:06:31.775 "rw_ios_per_sec": 0, 00:06:31.775 "rw_mbytes_per_sec": 0, 00:06:31.775 "r_mbytes_per_sec": 0, 00:06:31.775 "w_mbytes_per_sec": 0 00:06:31.775 }, 00:06:31.775 "claimed": true, 00:06:31.775 "claim_type": "exclusive_write", 00:06:31.775 "zoned": false, 00:06:31.775 "supported_io_types": { 00:06:31.775 "read": true, 00:06:31.775 "write": true, 00:06:31.775 "unmap": true, 00:06:31.775 "flush": true, 00:06:31.775 "reset": true, 00:06:31.775 "nvme_admin": false, 00:06:31.775 "nvme_io": false, 00:06:31.775 "nvme_io_md": false, 00:06:31.775 "write_zeroes": true, 00:06:31.775 "zcopy": true, 00:06:31.775 "get_zone_info": false, 00:06:31.775 "zone_management": false, 00:06:31.775 "zone_append": false, 00:06:31.775 "compare": false, 00:06:31.775 "compare_and_write": false, 00:06:31.775 "abort": true, 00:06:31.775 "seek_hole": false, 00:06:31.775 "seek_data": false, 00:06:31.776 "copy": true, 00:06:31.776 "nvme_iov_md": false 00:06:31.776 }, 00:06:31.776 "memory_domains": [ 00:06:31.776 { 00:06:31.776 "dma_device_id": "system", 00:06:31.776 "dma_device_type": 1 00:06:31.776 }, 00:06:31.776 { 00:06:31.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:31.776 "dma_device_type": 2 00:06:31.776 } 00:06:31.776 ], 00:06:31.776 "driver_specific": {} 00:06:31.776 }, 00:06:31.776 { 00:06:31.776 "name": "Passthru0", 00:06:31.776 "aliases": [ 00:06:31.776 "2ca1ea88-7590-57e8-ae0e-989074972c18" 00:06:31.776 ], 00:06:31.776 "product_name": "passthru", 00:06:31.776 "block_size": 512, 00:06:31.776 "num_blocks": 16384, 00:06:31.776 "uuid": "2ca1ea88-7590-57e8-ae0e-989074972c18", 00:06:31.776 "assigned_rate_limits": { 00:06:31.776 "rw_ios_per_sec": 0, 00:06:31.776 "rw_mbytes_per_sec": 0, 00:06:31.776 "r_mbytes_per_sec": 0, 00:06:31.776 "w_mbytes_per_sec": 0 00:06:31.776 }, 00:06:31.776 "claimed": false, 00:06:31.776 "zoned": false, 00:06:31.776 "supported_io_types": { 00:06:31.776 "read": true, 00:06:31.776 "write": true, 00:06:31.776 "unmap": true, 00:06:31.776 "flush": true, 00:06:31.776 "reset": true, 00:06:31.776 "nvme_admin": false, 00:06:31.776 "nvme_io": false, 00:06:31.776 "nvme_io_md": false, 
00:06:31.776 "write_zeroes": true, 00:06:31.776 "zcopy": true, 00:06:31.776 "get_zone_info": false, 00:06:31.776 "zone_management": false, 00:06:31.776 "zone_append": false, 00:06:31.776 "compare": false, 00:06:31.776 "compare_and_write": false, 00:06:31.776 "abort": true, 00:06:31.776 "seek_hole": false, 00:06:31.776 "seek_data": false, 00:06:31.776 "copy": true, 00:06:31.776 "nvme_iov_md": false 00:06:31.776 }, 00:06:31.776 "memory_domains": [ 00:06:31.776 { 00:06:31.776 "dma_device_id": "system", 00:06:31.776 "dma_device_type": 1 00:06:31.776 }, 00:06:31.776 { 00:06:31.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:31.776 "dma_device_type": 2 00:06:31.776 } 00:06:31.776 ], 00:06:31.776 "driver_specific": { 00:06:31.776 "passthru": { 00:06:31.776 "name": "Passthru0", 00:06:31.776 "base_bdev_name": "Malloc2" 00:06:31.776 } 00:06:31.776 } 00:06:31.776 } 00:06:31.776 ]' 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:31.776 00:06:31.776 real 0m0.274s 00:06:31.776 user 0m0.175s 00:06:31.776 sys 0m0.038s 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:31.776 01:22:17 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:31.776 ************************************ 00:06:31.776 END TEST rpc_daemon_integrity 00:06:31.776 ************************************ 00:06:32.034 01:22:17 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:32.034 01:22:17 rpc -- rpc/rpc.sh@84 -- # killprocess 809884 00:06:32.034 01:22:17 rpc -- common/autotest_common.sh@950 -- # '[' -z 809884 ']' 00:06:32.034 01:22:17 rpc -- common/autotest_common.sh@954 -- # kill -0 809884 00:06:32.034 01:22:17 rpc -- common/autotest_common.sh@955 -- # uname 00:06:32.034 01:22:17 rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:32.034 01:22:17 rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 809884 00:06:32.034 01:22:17 rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:32.034 
01:22:17 rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:32.034 01:22:17 rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 809884' 00:06:32.034 killing process with pid 809884 00:06:32.034 01:22:17 rpc -- common/autotest_common.sh@969 -- # kill 809884 00:06:32.034 01:22:17 rpc -- common/autotest_common.sh@974 -- # wait 809884 00:06:32.293 00:06:32.293 real 0m2.089s 00:06:32.293 user 0m2.652s 00:06:32.293 sys 0m0.746s 00:06:32.294 01:22:18 rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:32.294 01:22:18 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:32.294 ************************************ 00:06:32.294 END TEST rpc 00:06:32.294 ************************************ 00:06:32.294 01:22:18 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:32.294 01:22:18 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:32.294 01:22:18 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:32.294 01:22:18 -- common/autotest_common.sh@10 -- # set +x 00:06:32.294 ************************************ 00:06:32.294 START TEST skip_rpc 00:06:32.294 ************************************ 00:06:32.294 01:22:18 skip_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:32.553 * Looking for test storage... 00:06:32.553 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:32.553 01:22:18 skip_rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:32.553 01:22:18 skip_rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:06:32.553 01:22:18 skip_rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:32.553 01:22:18 skip_rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@345 -- # : 1 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:32.553 01:22:18 skip_rpc -- scripts/common.sh@368 -- # return 0 00:06:32.553 01:22:18 skip_rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:32.553 01:22:18 skip_rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:32.553 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:32.553 --rc genhtml_branch_coverage=1 00:06:32.553 --rc genhtml_function_coverage=1 00:06:32.553 --rc genhtml_legend=1 00:06:32.553 --rc geninfo_all_blocks=1 00:06:32.553 --rc geninfo_unexecuted_blocks=1 00:06:32.553 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:32.553 ' 00:06:32.553 01:22:18 skip_rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:32.553 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:32.553 --rc genhtml_branch_coverage=1 00:06:32.553 --rc genhtml_function_coverage=1 00:06:32.553 --rc genhtml_legend=1 00:06:32.553 --rc geninfo_all_blocks=1 00:06:32.553 --rc geninfo_unexecuted_blocks=1 00:06:32.553 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:32.553 ' 00:06:32.553 01:22:18 skip_rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:32.553 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:32.553 --rc genhtml_branch_coverage=1 00:06:32.553 --rc genhtml_function_coverage=1 00:06:32.553 --rc genhtml_legend=1 00:06:32.553 --rc geninfo_all_blocks=1 00:06:32.553 --rc geninfo_unexecuted_blocks=1 00:06:32.553 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:32.553 ' 00:06:32.553 01:22:18 skip_rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:32.553 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:32.553 --rc genhtml_branch_coverage=1 00:06:32.553 --rc genhtml_function_coverage=1 00:06:32.553 --rc genhtml_legend=1 00:06:32.553 --rc geninfo_all_blocks=1 00:06:32.553 --rc geninfo_unexecuted_blocks=1 00:06:32.553 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:32.553 ' 00:06:32.553 01:22:18 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:32.553 01:22:18 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:32.553 01:22:18 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:32.553 01:22:18 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:32.553 01:22:18 
skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:32.553 01:22:18 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:32.553 ************************************ 00:06:32.553 START TEST skip_rpc 00:06:32.553 ************************************ 00:06:32.553 01:22:18 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # test_skip_rpc 00:06:32.553 01:22:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=810344 00:06:32.553 01:22:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:32.553 01:22:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:32.553 01:22:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:32.553 [2024-12-17 01:22:18.507332] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:32.553 [2024-12-17 01:22:18.507390] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid810344 ] 00:06:32.812 [2024-12-17 01:22:18.572347] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.812 [2024-12-17 01:22:18.608911] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # local es=0 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # rpc_cmd spdk_get_version 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # es=1 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 810344 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # '[' -z 810344 ']' 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # kill -0 810344 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # uname 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 810344 00:06:38.082 
01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 810344' 00:06:38.082 killing process with pid 810344 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@969 -- # kill 810344 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@974 -- # wait 810344 00:06:38.082 00:06:38.082 real 0m5.386s 00:06:38.082 user 0m5.133s 00:06:38.082 sys 0m0.290s 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:38.082 01:22:23 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:38.082 ************************************ 00:06:38.082 END TEST skip_rpc 00:06:38.082 ************************************ 00:06:38.082 01:22:23 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:38.082 01:22:23 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:38.082 01:22:23 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:38.082 01:22:23 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:38.082 ************************************ 00:06:38.082 START TEST skip_rpc_with_json 00:06:38.082 ************************************ 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_json 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=811428 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 811428 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # '[' -z 811428 ']' 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:38.082 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:38.082 01:22:23 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:38.082 [2024-12-17 01:22:23.977567] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:06:38.082 [2024-12-17 01:22:23.977651] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid811428 ] 00:06:38.082 [2024-12-17 01:22:24.042920] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.082 [2024-12-17 01:22:24.078542] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # return 0 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:38.341 [2024-12-17 01:22:24.275660] nvmf_rpc.c:2703:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:38.341 request: 00:06:38.341 { 00:06:38.341 "trtype": "tcp", 00:06:38.341 "method": "nvmf_get_transports", 00:06:38.341 "req_id": 1 00:06:38.341 } 00:06:38.341 Got JSON-RPC error response 00:06:38.341 response: 00:06:38.341 { 00:06:38.341 "code": -19, 00:06:38.341 "message": "No such device" 00:06:38.341 } 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:38.341 [2024-12-17 01:22:24.283739] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:38.341 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:38.600 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:38.600 01:22:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:38.600 { 00:06:38.600 "subsystems": [ 00:06:38.600 { 00:06:38.600 "subsystem": "scheduler", 00:06:38.600 "config": [ 00:06:38.600 { 00:06:38.600 "method": "framework_set_scheduler", 00:06:38.600 "params": { 00:06:38.600 "name": "static" 00:06:38.600 } 00:06:38.600 } 00:06:38.600 ] 00:06:38.600 }, 00:06:38.600 { 00:06:38.600 "subsystem": "vmd", 00:06:38.600 "config": [] 00:06:38.600 }, 00:06:38.600 { 00:06:38.600 "subsystem": "sock", 00:06:38.600 "config": [ 00:06:38.600 { 00:06:38.600 "method": "sock_set_default_impl", 00:06:38.600 "params": { 00:06:38.600 "impl_name": "posix" 00:06:38.600 } 00:06:38.600 }, 00:06:38.600 { 00:06:38.600 "method": "sock_impl_set_options", 00:06:38.600 "params": { 00:06:38.600 "impl_name": "ssl", 00:06:38.600 "recv_buf_size": 4096, 00:06:38.600 "send_buf_size": 4096, 00:06:38.600 "enable_recv_pipe": true, 00:06:38.600 "enable_quickack": false, 00:06:38.600 "enable_placement_id": 
0, 00:06:38.600 "enable_zerocopy_send_server": true, 00:06:38.600 "enable_zerocopy_send_client": false, 00:06:38.600 "zerocopy_threshold": 0, 00:06:38.600 "tls_version": 0, 00:06:38.600 "enable_ktls": false 00:06:38.600 } 00:06:38.600 }, 00:06:38.600 { 00:06:38.600 "method": "sock_impl_set_options", 00:06:38.600 "params": { 00:06:38.601 "impl_name": "posix", 00:06:38.601 "recv_buf_size": 2097152, 00:06:38.601 "send_buf_size": 2097152, 00:06:38.601 "enable_recv_pipe": true, 00:06:38.601 "enable_quickack": false, 00:06:38.601 "enable_placement_id": 0, 00:06:38.601 "enable_zerocopy_send_server": true, 00:06:38.601 "enable_zerocopy_send_client": false, 00:06:38.601 "zerocopy_threshold": 0, 00:06:38.601 "tls_version": 0, 00:06:38.601 "enable_ktls": false 00:06:38.601 } 00:06:38.601 } 00:06:38.601 ] 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "iobuf", 00:06:38.601 "config": [ 00:06:38.601 { 00:06:38.601 "method": "iobuf_set_options", 00:06:38.601 "params": { 00:06:38.601 "small_pool_count": 8192, 00:06:38.601 "large_pool_count": 1024, 00:06:38.601 "small_bufsize": 8192, 00:06:38.601 "large_bufsize": 135168 00:06:38.601 } 00:06:38.601 } 00:06:38.601 ] 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "keyring", 00:06:38.601 "config": [] 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "vfio_user_target", 00:06:38.601 "config": null 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "fsdev", 00:06:38.601 "config": [ 00:06:38.601 { 00:06:38.601 "method": "fsdev_set_opts", 00:06:38.601 "params": { 00:06:38.601 "fsdev_io_pool_size": 65535, 00:06:38.601 "fsdev_io_cache_size": 256 00:06:38.601 } 00:06:38.601 } 00:06:38.601 ] 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "accel", 00:06:38.601 "config": [ 00:06:38.601 { 00:06:38.601 "method": "accel_set_options", 00:06:38.601 "params": { 00:06:38.601 "small_cache_size": 128, 00:06:38.601 "large_cache_size": 16, 00:06:38.601 "task_count": 2048, 00:06:38.601 "sequence_count": 2048, 00:06:38.601 "buf_count": 2048 00:06:38.601 } 00:06:38.601 } 00:06:38.601 ] 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "bdev", 00:06:38.601 "config": [ 00:06:38.601 { 00:06:38.601 "method": "bdev_set_options", 00:06:38.601 "params": { 00:06:38.601 "bdev_io_pool_size": 65535, 00:06:38.601 "bdev_io_cache_size": 256, 00:06:38.601 "bdev_auto_examine": true, 00:06:38.601 "iobuf_small_cache_size": 128, 00:06:38.601 "iobuf_large_cache_size": 16 00:06:38.601 } 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "method": "bdev_raid_set_options", 00:06:38.601 "params": { 00:06:38.601 "process_window_size_kb": 1024, 00:06:38.601 "process_max_bandwidth_mb_sec": 0 00:06:38.601 } 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "method": "bdev_nvme_set_options", 00:06:38.601 "params": { 00:06:38.601 "action_on_timeout": "none", 00:06:38.601 "timeout_us": 0, 00:06:38.601 "timeout_admin_us": 0, 00:06:38.601 "keep_alive_timeout_ms": 10000, 00:06:38.601 "arbitration_burst": 0, 00:06:38.601 "low_priority_weight": 0, 00:06:38.601 "medium_priority_weight": 0, 00:06:38.601 "high_priority_weight": 0, 00:06:38.601 "nvme_adminq_poll_period_us": 10000, 00:06:38.601 "nvme_ioq_poll_period_us": 0, 00:06:38.601 "io_queue_requests": 0, 00:06:38.601 "delay_cmd_submit": true, 00:06:38.601 "transport_retry_count": 4, 00:06:38.601 "bdev_retry_count": 3, 00:06:38.601 "transport_ack_timeout": 0, 00:06:38.601 "ctrlr_loss_timeout_sec": 0, 00:06:38.601 "reconnect_delay_sec": 0, 00:06:38.601 "fast_io_fail_timeout_sec": 0, 00:06:38.601 "disable_auto_failback": false, 
00:06:38.601 "generate_uuids": false, 00:06:38.601 "transport_tos": 0, 00:06:38.601 "nvme_error_stat": false, 00:06:38.601 "rdma_srq_size": 0, 00:06:38.601 "io_path_stat": false, 00:06:38.601 "allow_accel_sequence": false, 00:06:38.601 "rdma_max_cq_size": 0, 00:06:38.601 "rdma_cm_event_timeout_ms": 0, 00:06:38.601 "dhchap_digests": [ 00:06:38.601 "sha256", 00:06:38.601 "sha384", 00:06:38.601 "sha512" 00:06:38.601 ], 00:06:38.601 "dhchap_dhgroups": [ 00:06:38.601 "null", 00:06:38.601 "ffdhe2048", 00:06:38.601 "ffdhe3072", 00:06:38.601 "ffdhe4096", 00:06:38.601 "ffdhe6144", 00:06:38.601 "ffdhe8192" 00:06:38.601 ] 00:06:38.601 } 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "method": "bdev_nvme_set_hotplug", 00:06:38.601 "params": { 00:06:38.601 "period_us": 100000, 00:06:38.601 "enable": false 00:06:38.601 } 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "method": "bdev_iscsi_set_options", 00:06:38.601 "params": { 00:06:38.601 "timeout_sec": 30 00:06:38.601 } 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "method": "bdev_wait_for_examine" 00:06:38.601 } 00:06:38.601 ] 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "nvmf", 00:06:38.601 "config": [ 00:06:38.601 { 00:06:38.601 "method": "nvmf_set_config", 00:06:38.601 "params": { 00:06:38.601 "discovery_filter": "match_any", 00:06:38.601 "admin_cmd_passthru": { 00:06:38.601 "identify_ctrlr": false 00:06:38.601 }, 00:06:38.601 "dhchap_digests": [ 00:06:38.601 "sha256", 00:06:38.601 "sha384", 00:06:38.601 "sha512" 00:06:38.601 ], 00:06:38.601 "dhchap_dhgroups": [ 00:06:38.601 "null", 00:06:38.601 "ffdhe2048", 00:06:38.601 "ffdhe3072", 00:06:38.601 "ffdhe4096", 00:06:38.601 "ffdhe6144", 00:06:38.601 "ffdhe8192" 00:06:38.601 ] 00:06:38.601 } 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "method": "nvmf_set_max_subsystems", 00:06:38.601 "params": { 00:06:38.601 "max_subsystems": 1024 00:06:38.601 } 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "method": "nvmf_set_crdt", 00:06:38.601 "params": { 00:06:38.601 "crdt1": 0, 00:06:38.601 "crdt2": 0, 00:06:38.601 "crdt3": 0 00:06:38.601 } 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "method": "nvmf_create_transport", 00:06:38.601 "params": { 00:06:38.601 "trtype": "TCP", 00:06:38.601 "max_queue_depth": 128, 00:06:38.601 "max_io_qpairs_per_ctrlr": 127, 00:06:38.601 "in_capsule_data_size": 4096, 00:06:38.601 "max_io_size": 131072, 00:06:38.601 "io_unit_size": 131072, 00:06:38.601 "max_aq_depth": 128, 00:06:38.601 "num_shared_buffers": 511, 00:06:38.601 "buf_cache_size": 4294967295, 00:06:38.601 "dif_insert_or_strip": false, 00:06:38.601 "zcopy": false, 00:06:38.601 "c2h_success": true, 00:06:38.601 "sock_priority": 0, 00:06:38.601 "abort_timeout_sec": 1, 00:06:38.601 "ack_timeout": 0, 00:06:38.601 "data_wr_pool_size": 0 00:06:38.601 } 00:06:38.601 } 00:06:38.601 ] 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "nbd", 00:06:38.601 "config": [] 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "ublk", 00:06:38.601 "config": [] 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "vhost_blk", 00:06:38.601 "config": [] 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "scsi", 00:06:38.601 "config": null 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "iscsi", 00:06:38.601 "config": [ 00:06:38.601 { 00:06:38.601 "method": "iscsi_set_options", 00:06:38.601 "params": { 00:06:38.601 "node_base": "iqn.2016-06.io.spdk", 00:06:38.601 "max_sessions": 128, 00:06:38.601 "max_connections_per_session": 2, 00:06:38.601 "max_queue_depth": 64, 00:06:38.601 "default_time2wait": 2, 00:06:38.601 
"default_time2retain": 20, 00:06:38.601 "first_burst_length": 8192, 00:06:38.601 "immediate_data": true, 00:06:38.601 "allow_duplicated_isid": false, 00:06:38.601 "error_recovery_level": 0, 00:06:38.601 "nop_timeout": 60, 00:06:38.601 "nop_in_interval": 30, 00:06:38.601 "disable_chap": false, 00:06:38.601 "require_chap": false, 00:06:38.601 "mutual_chap": false, 00:06:38.601 "chap_group": 0, 00:06:38.601 "max_large_datain_per_connection": 64, 00:06:38.601 "max_r2t_per_connection": 4, 00:06:38.601 "pdu_pool_size": 36864, 00:06:38.601 "immediate_data_pool_size": 16384, 00:06:38.601 "data_out_pool_size": 2048 00:06:38.601 } 00:06:38.601 } 00:06:38.601 ] 00:06:38.601 }, 00:06:38.601 { 00:06:38.601 "subsystem": "vhost_scsi", 00:06:38.601 "config": [] 00:06:38.601 } 00:06:38.601 ] 00:06:38.601 } 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 811428 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 811428 ']' 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 811428 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 811428 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 811428' 00:06:38.601 killing process with pid 811428 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 811428 00:06:38.601 01:22:24 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 811428 00:06:38.861 01:22:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=811459 00:06:38.861 01:22:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:38.861 01:22:24 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:44.129 01:22:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 811459 00:06:44.129 01:22:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 811459 ']' 00:06:44.129 01:22:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 811459 00:06:44.129 01:22:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:06:44.129 01:22:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:44.129 01:22:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 811459 00:06:44.129 01:22:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:44.129 01:22:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:44.129 01:22:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 811459' 
00:06:44.129 killing process with pid 811459 00:06:44.129 01:22:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 811459 00:06:44.129 01:22:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 811459 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:44.388 00:06:44.388 real 0m6.252s 00:06:44.388 user 0m5.915s 00:06:44.388 sys 0m0.632s 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:44.388 ************************************ 00:06:44.388 END TEST skip_rpc_with_json 00:06:44.388 ************************************ 00:06:44.388 01:22:30 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:44.388 01:22:30 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:44.388 01:22:30 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:44.388 01:22:30 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:44.388 ************************************ 00:06:44.388 START TEST skip_rpc_with_delay 00:06:44.388 ************************************ 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_delay 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # local es=0 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:44.388 [2024-12-17 01:22:30.308440] 
app.c: 840:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:06:44.388 [2024-12-17 01:22:30.308574] app.c: 719:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # es=1 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:44.388 00:06:44.388 real 0m0.047s 00:06:44.388 user 0m0.025s 00:06:44.388 sys 0m0.022s 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:44.388 01:22:30 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:44.388 ************************************ 00:06:44.388 END TEST skip_rpc_with_delay 00:06:44.388 ************************************ 00:06:44.388 01:22:30 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:44.388 01:22:30 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:44.388 01:22:30 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:44.388 01:22:30 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:44.388 01:22:30 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:44.388 01:22:30 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:44.647 ************************************ 00:06:44.647 START TEST exit_on_failed_rpc_init 00:06:44.647 ************************************ 00:06:44.647 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # test_exit_on_failed_rpc_init 00:06:44.647 01:22:30 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=812557 00:06:44.647 01:22:30 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 812557 00:06:44.647 01:22:30 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:44.647 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # '[' -z 812557 ']' 00:06:44.647 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:44.647 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:44.647 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:44.647 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:44.647 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:44.647 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:44.647 [2024-12-17 01:22:30.439178] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:06:44.647 [2024-12-17 01:22:30.439255] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid812557 ] 00:06:44.647 [2024-12-17 01:22:30.509454] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.647 [2024-12-17 01:22:30.551841] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # return 0 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # local es=0 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:44.906 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:44.906 [2024-12-17 01:22:30.780875] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:44.906 [2024-12-17 01:22:30.780947] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid812569 ] 00:06:44.906 [2024-12-17 01:22:30.846608] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.906 [2024-12-17 01:22:30.885524] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:44.906 [2024-12-17 01:22:30.885627] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:06:44.906 [2024-12-17 01:22:30.885641] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:44.906 [2024-12-17 01:22:30.885649] app.c:1061:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # es=234 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # es=106 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # case "$es" in 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@670 -- # es=1 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 812557 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # '[' -z 812557 ']' 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # kill -0 812557 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # uname 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:45.165 01:22:30 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 812557 00:06:45.165 01:22:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:45.165 01:22:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:45.165 01:22:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 812557' 00:06:45.165 killing process with pid 812557 00:06:45.165 01:22:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@969 -- # kill 812557 00:06:45.165 01:22:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@974 -- # wait 812557 00:06:45.424 00:06:45.424 real 0m0.902s 00:06:45.424 user 0m0.897s 00:06:45.424 sys 0m0.440s 00:06:45.424 01:22:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:45.424 01:22:31 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:45.424 ************************************ 00:06:45.424 END TEST exit_on_failed_rpc_init 00:06:45.424 ************************************ 00:06:45.424 01:22:31 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:45.424 00:06:45.424 real 0m13.103s 00:06:45.424 user 0m12.198s 00:06:45.424 sys 0m1.704s 00:06:45.424 01:22:31 skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:45.424 01:22:31 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:45.424 ************************************ 00:06:45.424 END TEST skip_rpc 00:06:45.424 ************************************ 00:06:45.424 01:22:31 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:45.424 01:22:31 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:45.424 01:22:31 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:45.424 01:22:31 -- 
common/autotest_common.sh@10 -- # set +x 00:06:45.683 ************************************ 00:06:45.683 START TEST rpc_client 00:06:45.683 ************************************ 00:06:45.683 01:22:31 rpc_client -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:45.683 * Looking for test storage... 00:06:45.683 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:45.683 01:22:31 rpc_client -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:45.683 01:22:31 rpc_client -- common/autotest_common.sh@1681 -- # lcov --version 00:06:45.683 01:22:31 rpc_client -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:45.683 01:22:31 rpc_client -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@345 -- # : 1 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@353 -- # local d=1 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@355 -- # echo 1 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@353 -- # local d=2 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@355 -- # echo 2 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:45.683 01:22:31 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:45.684 01:22:31 rpc_client -- scripts/common.sh@368 -- # return 0 00:06:45.684 01:22:31 rpc_client -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:45.684 01:22:31 rpc_client -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:45.684 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.684 --rc genhtml_branch_coverage=1 00:06:45.684 --rc genhtml_function_coverage=1 00:06:45.684 --rc genhtml_legend=1 00:06:45.684 --rc geninfo_all_blocks=1 00:06:45.684 --rc geninfo_unexecuted_blocks=1 00:06:45.684 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.684 ' 00:06:45.684 01:22:31 rpc_client -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:45.684 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.684 --rc genhtml_branch_coverage=1 00:06:45.684 --rc genhtml_function_coverage=1 00:06:45.684 --rc genhtml_legend=1 00:06:45.684 --rc geninfo_all_blocks=1 00:06:45.684 --rc geninfo_unexecuted_blocks=1 00:06:45.684 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.684 ' 00:06:45.684 01:22:31 rpc_client -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:45.684 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.684 --rc genhtml_branch_coverage=1 00:06:45.684 --rc genhtml_function_coverage=1 00:06:45.684 --rc genhtml_legend=1 00:06:45.684 --rc geninfo_all_blocks=1 00:06:45.684 --rc geninfo_unexecuted_blocks=1 00:06:45.684 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.684 ' 00:06:45.684 01:22:31 rpc_client -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:45.684 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.684 --rc genhtml_branch_coverage=1 00:06:45.684 --rc genhtml_function_coverage=1 00:06:45.684 --rc genhtml_legend=1 00:06:45.684 --rc geninfo_all_blocks=1 00:06:45.684 --rc geninfo_unexecuted_blocks=1 00:06:45.684 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.684 ' 00:06:45.684 01:22:31 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:45.684 OK 00:06:45.684 01:22:31 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:45.684 00:06:45.684 real 0m0.195s 00:06:45.684 user 0m0.104s 00:06:45.684 sys 0m0.109s 00:06:45.684 01:22:31 rpc_client -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:06:45.684 01:22:31 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:45.684 ************************************ 00:06:45.684 END TEST rpc_client 00:06:45.684 ************************************ 00:06:45.684 01:22:31 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:45.684 01:22:31 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:45.684 01:22:31 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:45.684 01:22:31 -- common/autotest_common.sh@10 -- # set +x 00:06:45.943 ************************************ 00:06:45.943 START TEST json_config 00:06:45.943 ************************************ 00:06:45.943 01:22:31 json_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:45.943 01:22:31 json_config -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:45.943 01:22:31 json_config -- common/autotest_common.sh@1681 -- # lcov --version 00:06:45.943 01:22:31 json_config -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:45.943 01:22:31 json_config -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:45.943 01:22:31 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:45.943 01:22:31 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:45.943 01:22:31 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:45.943 01:22:31 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:06:45.943 01:22:31 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:06:45.943 01:22:31 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:06:45.943 01:22:31 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:06:45.943 01:22:31 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:06:45.943 01:22:31 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:06:45.943 01:22:31 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:06:45.943 01:22:31 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:45.943 01:22:31 json_config -- scripts/common.sh@344 -- # case "$op" in 00:06:45.943 01:22:31 json_config -- scripts/common.sh@345 -- # : 1 00:06:45.943 01:22:31 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:45.943 01:22:31 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:45.943 01:22:31 json_config -- scripts/common.sh@365 -- # decimal 1 00:06:45.943 01:22:31 json_config -- scripts/common.sh@353 -- # local d=1 00:06:45.943 01:22:31 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:45.943 01:22:31 json_config -- scripts/common.sh@355 -- # echo 1 00:06:45.943 01:22:31 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:06:45.943 01:22:31 json_config -- scripts/common.sh@366 -- # decimal 2 00:06:45.943 01:22:31 json_config -- scripts/common.sh@353 -- # local d=2 00:06:45.943 01:22:31 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:45.943 01:22:31 json_config -- scripts/common.sh@355 -- # echo 2 00:06:45.943 01:22:31 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:06:45.943 01:22:31 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:45.943 01:22:31 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:45.943 01:22:31 json_config -- scripts/common.sh@368 -- # return 0 00:06:45.943 01:22:31 json_config -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:45.943 01:22:31 json_config -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:45.943 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.943 --rc genhtml_branch_coverage=1 00:06:45.943 --rc genhtml_function_coverage=1 00:06:45.943 --rc genhtml_legend=1 00:06:45.943 --rc geninfo_all_blocks=1 00:06:45.943 --rc geninfo_unexecuted_blocks=1 00:06:45.943 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.943 ' 00:06:45.943 01:22:31 json_config -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:45.943 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.943 --rc genhtml_branch_coverage=1 00:06:45.943 --rc genhtml_function_coverage=1 00:06:45.943 --rc genhtml_legend=1 00:06:45.943 --rc geninfo_all_blocks=1 00:06:45.943 --rc geninfo_unexecuted_blocks=1 00:06:45.943 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.943 ' 00:06:45.943 01:22:31 json_config -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:45.943 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.943 --rc genhtml_branch_coverage=1 00:06:45.943 --rc genhtml_function_coverage=1 00:06:45.943 --rc genhtml_legend=1 00:06:45.943 --rc geninfo_all_blocks=1 00:06:45.943 --rc geninfo_unexecuted_blocks=1 00:06:45.943 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.943 ' 00:06:45.943 01:22:31 json_config -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:45.943 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.943 --rc genhtml_branch_coverage=1 00:06:45.943 --rc genhtml_function_coverage=1 00:06:45.943 --rc genhtml_legend=1 00:06:45.943 --rc geninfo_all_blocks=1 00:06:45.943 --rc geninfo_unexecuted_blocks=1 00:06:45.943 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.943 ' 00:06:45.943 01:22:31 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:45.943 01:22:31 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:45.943 01:22:31 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:45.943 01:22:31 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:45.943 01:22:31 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:06:45.943 01:22:31 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:45.943 01:22:31 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:45.943 01:22:31 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:45.944 01:22:31 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:06:45.944 01:22:31 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:45.944 01:22:31 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:45.944 01:22:31 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:45.944 01:22:31 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:45.944 01:22:31 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:45.944 01:22:31 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:45.944 01:22:31 json_config -- paths/export.sh@5 -- # export PATH 00:06:45.944 01:22:31 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@51 -- # : 0 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:45.944 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:45.944 01:22:31 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:45.944 01:22:31 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:45.944 01:22:31 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:45.944 01:22:31 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:45.944 01:22:31 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:45.944 01:22:31 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:45.944 01:22:31 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:45.944 WARNING: No tests are enabled so not running JSON configuration tests 00:06:45.944 01:22:31 json_config -- json_config/json_config.sh@28 -- # exit 0 00:06:45.944 00:06:45.944 real 0m0.179s 00:06:45.944 user 0m0.096s 00:06:45.944 sys 0m0.091s 00:06:45.944 01:22:31 json_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:45.944 01:22:31 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:45.944 ************************************ 00:06:45.944 END TEST json_config 00:06:45.944 ************************************ 00:06:45.944 01:22:31 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:45.944 01:22:31 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:45.944 01:22:31 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:45.944 01:22:31 -- common/autotest_common.sh@10 -- # set +x 00:06:46.204 ************************************ 00:06:46.204 START TEST json_config_extra_key 00:06:46.204 ************************************ 00:06:46.204 01:22:31 json_config_extra_key -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:46.204 01:22:32 json_config_extra_key -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:46.204 01:22:32 json_config_extra_key -- common/autotest_common.sh@1681 -- # lcov 
--version 00:06:46.204 01:22:32 json_config_extra_key -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:46.204 01:22:32 json_config_extra_key -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:06:46.204 01:22:32 json_config_extra_key -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:46.204 01:22:32 json_config_extra_key -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:46.204 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.204 --rc genhtml_branch_coverage=1 00:06:46.204 --rc genhtml_function_coverage=1 00:06:46.204 --rc genhtml_legend=1 00:06:46.204 --rc geninfo_all_blocks=1 00:06:46.204 --rc geninfo_unexecuted_blocks=1 00:06:46.204 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.204 ' 00:06:46.204 01:22:32 json_config_extra_key -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:46.204 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.204 --rc genhtml_branch_coverage=1 
00:06:46.204 --rc genhtml_function_coverage=1 00:06:46.204 --rc genhtml_legend=1 00:06:46.204 --rc geninfo_all_blocks=1 00:06:46.204 --rc geninfo_unexecuted_blocks=1 00:06:46.204 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.204 ' 00:06:46.204 01:22:32 json_config_extra_key -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:46.204 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.204 --rc genhtml_branch_coverage=1 00:06:46.204 --rc genhtml_function_coverage=1 00:06:46.204 --rc genhtml_legend=1 00:06:46.204 --rc geninfo_all_blocks=1 00:06:46.204 --rc geninfo_unexecuted_blocks=1 00:06:46.204 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.204 ' 00:06:46.204 01:22:32 json_config_extra_key -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:46.204 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.204 --rc genhtml_branch_coverage=1 00:06:46.204 --rc genhtml_function_coverage=1 00:06:46.204 --rc genhtml_legend=1 00:06:46.204 --rc geninfo_all_blocks=1 00:06:46.204 --rc geninfo_unexecuted_blocks=1 00:06:46.204 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.204 ' 00:06:46.204 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:46.204 01:22:32 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:46.204 01:22:32 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:46.204 01:22:32 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:46.204 01:22:32 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:46.205 01:22:32 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:46.205 01:22:32 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:46.205 01:22:32 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:46.205 01:22:32 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:46.205 01:22:32 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:06:46.205 01:22:32 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:46.205 01:22:32 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:46.205 01:22:32 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:46.205 01:22:32 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:46.205 01:22:32 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:46.205 01:22:32 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:46.205 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:46.205 01:22:32 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:46.205 01:22:32 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:46.205 01:22:32 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:46.205 INFO: launching applications... 00:06:46.205 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:46.205 01:22:32 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:46.205 01:22:32 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:46.205 01:22:32 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:46.205 01:22:32 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:46.205 01:22:32 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:46.205 01:22:32 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:46.205 01:22:32 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:46.205 01:22:32 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=813003 00:06:46.205 01:22:32 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:46.205 Waiting for target to run... 00:06:46.205 01:22:32 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 813003 /var/tmp/spdk_tgt.sock 00:06:46.205 01:22:32 json_config_extra_key -- common/autotest_common.sh@831 -- # '[' -z 813003 ']' 00:06:46.205 01:22:32 json_config_extra_key -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:46.205 01:22:32 json_config_extra_key -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:46.205 01:22:32 json_config_extra_key -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:46.205 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
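A side note on the launch step recorded here: json_config_test_start_app starts spdk_tgt with the extra_key.json config on a private RPC socket, and waitforlisten blocks until that socket answers. Below is a minimal standalone sketch of the same idea, assuming the workspace paths from this run and using only flags that also appear in this log; the polling loop is an approximation of waitforlisten, not its exact implementation.

    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk        # path assumed from this run
    "$SPDK/build/bin/spdk_tgt" -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
        --json "$SPDK/test/json_config/extra_key.json" &
    pid=$!
    # Poll the private RPC socket until spdk_tgt answers, then report readiness.
    until "$SPDK/scripts/rpc.py" -s /var/tmp/spdk_tgt.sock rpc_get_methods >/dev/null 2>&1; do
        sleep 0.5
    done
    echo "spdk_tgt (pid $pid) is listening on /var/tmp/spdk_tgt.sock"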
00:06:46.205 01:22:32 json_config_extra_key -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:46.205 01:22:32 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:46.205 01:22:32 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:46.205 [2024-12-17 01:22:32.164046] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:46.205 [2024-12-17 01:22:32.164109] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid813003 ] 00:06:46.464 [2024-12-17 01:22:32.443051] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.464 [2024-12-17 01:22:32.464830] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.031 01:22:32 json_config_extra_key -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:47.032 01:22:32 json_config_extra_key -- common/autotest_common.sh@864 -- # return 0 00:06:47.032 01:22:32 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:47.032 00:06:47.032 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:47.032 INFO: shutting down applications... 00:06:47.032 01:22:32 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:47.032 01:22:32 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:47.032 01:22:32 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:47.032 01:22:32 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 813003 ]] 00:06:47.032 01:22:32 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 813003 00:06:47.032 01:22:32 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:47.032 01:22:32 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:47.032 01:22:32 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 813003 00:06:47.032 01:22:32 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:47.599 01:22:33 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:47.599 01:22:33 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:47.599 01:22:33 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 813003 00:06:47.599 01:22:33 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:47.599 01:22:33 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:47.599 01:22:33 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:47.599 01:22:33 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:47.599 SPDK target shutdown done 00:06:47.600 01:22:33 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:47.600 Success 00:06:47.600 00:06:47.600 real 0m1.547s 00:06:47.600 user 0m1.282s 00:06:47.600 sys 0m0.424s 00:06:47.600 01:22:33 json_config_extra_key -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:47.600 01:22:33 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:47.600 
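The shutdown recorded just above follows the json_config/common.sh pattern: send SIGINT to the target, then poll kill -0 up to 30 times at 0.5 s intervals before reporting "SPDK target shutdown done". A minimal sketch of that loop, assuming $pid still holds the target PID from the launch step:

    kill -SIGINT "$pid"
    # Bounded poll: give the target up to ~15 s to exit cleanly after SIGINT.
    for i in $(seq 1 30); do
        if ! kill -0 "$pid" 2>/dev/null; then
            echo "SPDK target shutdown done"
            break
        fi
        sleep 0.5
    done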
************************************ 00:06:47.600 END TEST json_config_extra_key 00:06:47.600 ************************************ 00:06:47.600 01:22:33 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:47.600 01:22:33 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:47.600 01:22:33 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:47.600 01:22:33 -- common/autotest_common.sh@10 -- # set +x 00:06:47.600 ************************************ 00:06:47.600 START TEST alias_rpc 00:06:47.600 ************************************ 00:06:47.600 01:22:33 alias_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:47.859 * Looking for test storage... 00:06:47.859 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@345 -- # : 1 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:47.859 01:22:33 alias_rpc -- scripts/common.sh@368 -- # return 0 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:47.859 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.859 --rc genhtml_branch_coverage=1 00:06:47.859 --rc genhtml_function_coverage=1 00:06:47.859 --rc genhtml_legend=1 00:06:47.859 --rc geninfo_all_blocks=1 00:06:47.859 --rc geninfo_unexecuted_blocks=1 00:06:47.859 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.859 ' 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:47.859 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.859 --rc genhtml_branch_coverage=1 00:06:47.859 --rc genhtml_function_coverage=1 00:06:47.859 --rc genhtml_legend=1 00:06:47.859 --rc geninfo_all_blocks=1 00:06:47.859 --rc geninfo_unexecuted_blocks=1 00:06:47.859 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.859 ' 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:47.859 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.859 --rc genhtml_branch_coverage=1 00:06:47.859 --rc genhtml_function_coverage=1 00:06:47.859 --rc genhtml_legend=1 00:06:47.859 --rc geninfo_all_blocks=1 00:06:47.859 --rc geninfo_unexecuted_blocks=1 00:06:47.859 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.859 ' 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:47.859 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.859 --rc genhtml_branch_coverage=1 00:06:47.859 --rc genhtml_function_coverage=1 00:06:47.859 --rc genhtml_legend=1 00:06:47.859 --rc geninfo_all_blocks=1 00:06:47.859 --rc geninfo_unexecuted_blocks=1 00:06:47.859 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.859 ' 00:06:47.859 01:22:33 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:47.859 01:22:33 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=813332 00:06:47.859 01:22:33 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 813332 00:06:47.859 01:22:33 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:47.859 01:22:33 alias_rpc -- 
common/autotest_common.sh@831 -- # '[' -z 813332 ']' 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:47.859 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:47.859 01:22:33 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:47.859 [2024-12-17 01:22:33.774616] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:47.859 [2024-12-17 01:22:33.774676] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid813332 ] 00:06:47.859 [2024-12-17 01:22:33.839545] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.118 [2024-12-17 01:22:33.877634] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.118 01:22:34 alias_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:48.118 01:22:34 alias_rpc -- common/autotest_common.sh@864 -- # return 0 00:06:48.118 01:22:34 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:48.377 01:22:34 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 813332 00:06:48.377 01:22:34 alias_rpc -- common/autotest_common.sh@950 -- # '[' -z 813332 ']' 00:06:48.377 01:22:34 alias_rpc -- common/autotest_common.sh@954 -- # kill -0 813332 00:06:48.377 01:22:34 alias_rpc -- common/autotest_common.sh@955 -- # uname 00:06:48.377 01:22:34 alias_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:48.377 01:22:34 alias_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 813332 00:06:48.377 01:22:34 alias_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:48.377 01:22:34 alias_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:48.377 01:22:34 alias_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 813332' 00:06:48.377 killing process with pid 813332 00:06:48.377 01:22:34 alias_rpc -- common/autotest_common.sh@969 -- # kill 813332 00:06:48.378 01:22:34 alias_rpc -- common/autotest_common.sh@974 -- # wait 813332 00:06:48.946 00:06:48.946 real 0m1.079s 00:06:48.946 user 0m1.061s 00:06:48.946 sys 0m0.448s 00:06:48.946 01:22:34 alias_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:48.946 01:22:34 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:48.946 ************************************ 00:06:48.946 END TEST alias_rpc 00:06:48.946 ************************************ 00:06:48.946 01:22:34 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:06:48.946 01:22:34 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:48.946 01:22:34 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:48.946 01:22:34 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:48.946 01:22:34 -- common/autotest_common.sh@10 -- # set +x 00:06:48.946 ************************************ 00:06:48.946 START TEST spdkcli_tcp 
00:06:48.946 ************************************ 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:48.946 * Looking for test storage... 00:06:48.946 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@1681 -- # lcov --version 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:48.946 01:22:34 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:48.946 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.946 --rc genhtml_branch_coverage=1 00:06:48.946 --rc genhtml_function_coverage=1 00:06:48.946 --rc genhtml_legend=1 00:06:48.946 --rc geninfo_all_blocks=1 00:06:48.946 --rc geninfo_unexecuted_blocks=1 00:06:48.946 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.946 ' 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:48.946 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.946 --rc genhtml_branch_coverage=1 00:06:48.946 --rc genhtml_function_coverage=1 00:06:48.946 --rc genhtml_legend=1 00:06:48.946 --rc geninfo_all_blocks=1 00:06:48.946 --rc geninfo_unexecuted_blocks=1 00:06:48.946 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.946 ' 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:48.946 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.946 --rc genhtml_branch_coverage=1 00:06:48.946 --rc genhtml_function_coverage=1 00:06:48.946 --rc genhtml_legend=1 00:06:48.946 --rc geninfo_all_blocks=1 00:06:48.946 --rc geninfo_unexecuted_blocks=1 00:06:48.946 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.946 ' 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:48.946 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:48.946 --rc genhtml_branch_coverage=1 00:06:48.946 --rc genhtml_function_coverage=1 00:06:48.946 --rc genhtml_legend=1 00:06:48.946 --rc geninfo_all_blocks=1 00:06:48.946 --rc geninfo_unexecuted_blocks=1 00:06:48.946 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:48.946 ' 00:06:48.946 01:22:34 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:48.946 01:22:34 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:48.946 01:22:34 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:48.946 01:22:34 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:48.946 01:22:34 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:48.946 01:22:34 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:48.946 01:22:34 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:48.946 01:22:34 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=813656 00:06:48.946 01:22:34 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 813656 00:06:48.946 01:22:34 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@831 -- # '[' -z 813656 ']' 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:48.946 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:48.946 01:22:34 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:48.946 [2024-12-17 01:22:34.948390] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:48.947 [2024-12-17 01:22:34.948451] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid813656 ] 00:06:49.205 [2024-12-17 01:22:35.012838] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:49.205 [2024-12-17 01:22:35.051229] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:49.205 [2024-12-17 01:22:35.051230] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.465 01:22:35 spdkcli_tcp -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:49.465 01:22:35 spdkcli_tcp -- common/autotest_common.sh@864 -- # return 0 00:06:49.465 01:22:35 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=813660 00:06:49.465 01:22:35 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:49.465 01:22:35 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:49.465 [ 00:06:49.465 "spdk_get_version", 00:06:49.465 "rpc_get_methods", 00:06:49.465 "notify_get_notifications", 00:06:49.465 "notify_get_types", 00:06:49.465 "trace_get_info", 00:06:49.465 "trace_get_tpoint_group_mask", 00:06:49.465 "trace_disable_tpoint_group", 00:06:49.465 "trace_enable_tpoint_group", 00:06:49.465 "trace_clear_tpoint_mask", 00:06:49.465 "trace_set_tpoint_mask", 00:06:49.465 "fsdev_set_opts", 00:06:49.465 "fsdev_get_opts", 00:06:49.465 "framework_get_pci_devices", 00:06:49.465 "framework_get_config", 00:06:49.465 "framework_get_subsystems", 00:06:49.465 "vfu_tgt_set_base_path", 00:06:49.465 
"keyring_get_keys", 00:06:49.465 "iobuf_get_stats", 00:06:49.465 "iobuf_set_options", 00:06:49.465 "sock_get_default_impl", 00:06:49.465 "sock_set_default_impl", 00:06:49.465 "sock_impl_set_options", 00:06:49.465 "sock_impl_get_options", 00:06:49.465 "vmd_rescan", 00:06:49.465 "vmd_remove_device", 00:06:49.465 "vmd_enable", 00:06:49.465 "accel_get_stats", 00:06:49.465 "accel_set_options", 00:06:49.465 "accel_set_driver", 00:06:49.465 "accel_crypto_key_destroy", 00:06:49.465 "accel_crypto_keys_get", 00:06:49.465 "accel_crypto_key_create", 00:06:49.465 "accel_assign_opc", 00:06:49.465 "accel_get_module_info", 00:06:49.465 "accel_get_opc_assignments", 00:06:49.465 "bdev_get_histogram", 00:06:49.465 "bdev_enable_histogram", 00:06:49.465 "bdev_set_qos_limit", 00:06:49.465 "bdev_set_qd_sampling_period", 00:06:49.465 "bdev_get_bdevs", 00:06:49.465 "bdev_reset_iostat", 00:06:49.465 "bdev_get_iostat", 00:06:49.465 "bdev_examine", 00:06:49.465 "bdev_wait_for_examine", 00:06:49.465 "bdev_set_options", 00:06:49.465 "scsi_get_devices", 00:06:49.465 "thread_set_cpumask", 00:06:49.465 "scheduler_set_options", 00:06:49.465 "framework_get_governor", 00:06:49.465 "framework_get_scheduler", 00:06:49.465 "framework_set_scheduler", 00:06:49.465 "framework_get_reactors", 00:06:49.465 "thread_get_io_channels", 00:06:49.465 "thread_get_pollers", 00:06:49.465 "thread_get_stats", 00:06:49.465 "framework_monitor_context_switch", 00:06:49.465 "spdk_kill_instance", 00:06:49.465 "log_enable_timestamps", 00:06:49.465 "log_get_flags", 00:06:49.465 "log_clear_flag", 00:06:49.465 "log_set_flag", 00:06:49.465 "log_get_level", 00:06:49.465 "log_set_level", 00:06:49.465 "log_get_print_level", 00:06:49.465 "log_set_print_level", 00:06:49.465 "framework_enable_cpumask_locks", 00:06:49.465 "framework_disable_cpumask_locks", 00:06:49.465 "framework_wait_init", 00:06:49.465 "framework_start_init", 00:06:49.465 "virtio_blk_create_transport", 00:06:49.465 "virtio_blk_get_transports", 00:06:49.465 "vhost_controller_set_coalescing", 00:06:49.465 "vhost_get_controllers", 00:06:49.465 "vhost_delete_controller", 00:06:49.465 "vhost_create_blk_controller", 00:06:49.465 "vhost_scsi_controller_remove_target", 00:06:49.465 "vhost_scsi_controller_add_target", 00:06:49.465 "vhost_start_scsi_controller", 00:06:49.465 "vhost_create_scsi_controller", 00:06:49.465 "ublk_recover_disk", 00:06:49.465 "ublk_get_disks", 00:06:49.465 "ublk_stop_disk", 00:06:49.465 "ublk_start_disk", 00:06:49.465 "ublk_destroy_target", 00:06:49.465 "ublk_create_target", 00:06:49.465 "nbd_get_disks", 00:06:49.465 "nbd_stop_disk", 00:06:49.465 "nbd_start_disk", 00:06:49.465 "env_dpdk_get_mem_stats", 00:06:49.465 "nvmf_stop_mdns_prr", 00:06:49.465 "nvmf_publish_mdns_prr", 00:06:49.465 "nvmf_subsystem_get_listeners", 00:06:49.465 "nvmf_subsystem_get_qpairs", 00:06:49.465 "nvmf_subsystem_get_controllers", 00:06:49.465 "nvmf_get_stats", 00:06:49.465 "nvmf_get_transports", 00:06:49.465 "nvmf_create_transport", 00:06:49.465 "nvmf_get_targets", 00:06:49.465 "nvmf_delete_target", 00:06:49.465 "nvmf_create_target", 00:06:49.465 "nvmf_subsystem_allow_any_host", 00:06:49.465 "nvmf_subsystem_set_keys", 00:06:49.465 "nvmf_subsystem_remove_host", 00:06:49.465 "nvmf_subsystem_add_host", 00:06:49.465 "nvmf_ns_remove_host", 00:06:49.465 "nvmf_ns_add_host", 00:06:49.465 "nvmf_subsystem_remove_ns", 00:06:49.465 "nvmf_subsystem_set_ns_ana_group", 00:06:49.465 "nvmf_subsystem_add_ns", 00:06:49.465 "nvmf_subsystem_listener_set_ana_state", 00:06:49.465 "nvmf_discovery_get_referrals", 
00:06:49.465 "nvmf_discovery_remove_referral", 00:06:49.465 "nvmf_discovery_add_referral", 00:06:49.465 "nvmf_subsystem_remove_listener", 00:06:49.465 "nvmf_subsystem_add_listener", 00:06:49.465 "nvmf_delete_subsystem", 00:06:49.465 "nvmf_create_subsystem", 00:06:49.465 "nvmf_get_subsystems", 00:06:49.465 "nvmf_set_crdt", 00:06:49.465 "nvmf_set_config", 00:06:49.465 "nvmf_set_max_subsystems", 00:06:49.465 "iscsi_get_histogram", 00:06:49.465 "iscsi_enable_histogram", 00:06:49.465 "iscsi_set_options", 00:06:49.465 "iscsi_get_auth_groups", 00:06:49.465 "iscsi_auth_group_remove_secret", 00:06:49.465 "iscsi_auth_group_add_secret", 00:06:49.465 "iscsi_delete_auth_group", 00:06:49.465 "iscsi_create_auth_group", 00:06:49.465 "iscsi_set_discovery_auth", 00:06:49.465 "iscsi_get_options", 00:06:49.465 "iscsi_target_node_request_logout", 00:06:49.465 "iscsi_target_node_set_redirect", 00:06:49.465 "iscsi_target_node_set_auth", 00:06:49.465 "iscsi_target_node_add_lun", 00:06:49.465 "iscsi_get_stats", 00:06:49.465 "iscsi_get_connections", 00:06:49.465 "iscsi_portal_group_set_auth", 00:06:49.465 "iscsi_start_portal_group", 00:06:49.465 "iscsi_delete_portal_group", 00:06:49.465 "iscsi_create_portal_group", 00:06:49.465 "iscsi_get_portal_groups", 00:06:49.465 "iscsi_delete_target_node", 00:06:49.465 "iscsi_target_node_remove_pg_ig_maps", 00:06:49.465 "iscsi_target_node_add_pg_ig_maps", 00:06:49.465 "iscsi_create_target_node", 00:06:49.465 "iscsi_get_target_nodes", 00:06:49.465 "iscsi_delete_initiator_group", 00:06:49.465 "iscsi_initiator_group_remove_initiators", 00:06:49.465 "iscsi_initiator_group_add_initiators", 00:06:49.465 "iscsi_create_initiator_group", 00:06:49.465 "iscsi_get_initiator_groups", 00:06:49.465 "fsdev_aio_delete", 00:06:49.465 "fsdev_aio_create", 00:06:49.465 "keyring_linux_set_options", 00:06:49.465 "keyring_file_remove_key", 00:06:49.465 "keyring_file_add_key", 00:06:49.465 "vfu_virtio_create_fs_endpoint", 00:06:49.465 "vfu_virtio_create_scsi_endpoint", 00:06:49.465 "vfu_virtio_scsi_remove_target", 00:06:49.465 "vfu_virtio_scsi_add_target", 00:06:49.465 "vfu_virtio_create_blk_endpoint", 00:06:49.465 "vfu_virtio_delete_endpoint", 00:06:49.465 "iaa_scan_accel_module", 00:06:49.465 "dsa_scan_accel_module", 00:06:49.465 "ioat_scan_accel_module", 00:06:49.465 "accel_error_inject_error", 00:06:49.465 "bdev_iscsi_delete", 00:06:49.465 "bdev_iscsi_create", 00:06:49.465 "bdev_iscsi_set_options", 00:06:49.465 "bdev_virtio_attach_controller", 00:06:49.465 "bdev_virtio_scsi_get_devices", 00:06:49.465 "bdev_virtio_detach_controller", 00:06:49.465 "bdev_virtio_blk_set_hotplug", 00:06:49.465 "bdev_ftl_set_property", 00:06:49.465 "bdev_ftl_get_properties", 00:06:49.465 "bdev_ftl_get_stats", 00:06:49.465 "bdev_ftl_unmap", 00:06:49.465 "bdev_ftl_unload", 00:06:49.465 "bdev_ftl_delete", 00:06:49.465 "bdev_ftl_load", 00:06:49.465 "bdev_ftl_create", 00:06:49.465 "bdev_aio_delete", 00:06:49.465 "bdev_aio_rescan", 00:06:49.465 "bdev_aio_create", 00:06:49.465 "blobfs_create", 00:06:49.465 "blobfs_detect", 00:06:49.465 "blobfs_set_cache_size", 00:06:49.465 "bdev_zone_block_delete", 00:06:49.465 "bdev_zone_block_create", 00:06:49.465 "bdev_delay_delete", 00:06:49.465 "bdev_delay_create", 00:06:49.465 "bdev_delay_update_latency", 00:06:49.465 "bdev_split_delete", 00:06:49.465 "bdev_split_create", 00:06:49.465 "bdev_error_inject_error", 00:06:49.465 "bdev_error_delete", 00:06:49.465 "bdev_error_create", 00:06:49.465 "bdev_raid_set_options", 00:06:49.465 "bdev_raid_remove_base_bdev", 00:06:49.465 
"bdev_raid_add_base_bdev", 00:06:49.465 "bdev_raid_delete", 00:06:49.465 "bdev_raid_create", 00:06:49.465 "bdev_raid_get_bdevs", 00:06:49.465 "bdev_lvol_set_parent_bdev", 00:06:49.465 "bdev_lvol_set_parent", 00:06:49.465 "bdev_lvol_check_shallow_copy", 00:06:49.465 "bdev_lvol_start_shallow_copy", 00:06:49.465 "bdev_lvol_grow_lvstore", 00:06:49.465 "bdev_lvol_get_lvols", 00:06:49.465 "bdev_lvol_get_lvstores", 00:06:49.465 "bdev_lvol_delete", 00:06:49.465 "bdev_lvol_set_read_only", 00:06:49.465 "bdev_lvol_resize", 00:06:49.465 "bdev_lvol_decouple_parent", 00:06:49.465 "bdev_lvol_inflate", 00:06:49.465 "bdev_lvol_rename", 00:06:49.465 "bdev_lvol_clone_bdev", 00:06:49.465 "bdev_lvol_clone", 00:06:49.465 "bdev_lvol_snapshot", 00:06:49.465 "bdev_lvol_create", 00:06:49.465 "bdev_lvol_delete_lvstore", 00:06:49.466 "bdev_lvol_rename_lvstore", 00:06:49.466 "bdev_lvol_create_lvstore", 00:06:49.466 "bdev_passthru_delete", 00:06:49.466 "bdev_passthru_create", 00:06:49.466 "bdev_nvme_cuse_unregister", 00:06:49.466 "bdev_nvme_cuse_register", 00:06:49.466 "bdev_opal_new_user", 00:06:49.466 "bdev_opal_set_lock_state", 00:06:49.466 "bdev_opal_delete", 00:06:49.466 "bdev_opal_get_info", 00:06:49.466 "bdev_opal_create", 00:06:49.466 "bdev_nvme_opal_revert", 00:06:49.466 "bdev_nvme_opal_init", 00:06:49.466 "bdev_nvme_send_cmd", 00:06:49.466 "bdev_nvme_set_keys", 00:06:49.466 "bdev_nvme_get_path_iostat", 00:06:49.466 "bdev_nvme_get_mdns_discovery_info", 00:06:49.466 "bdev_nvme_stop_mdns_discovery", 00:06:49.466 "bdev_nvme_start_mdns_discovery", 00:06:49.466 "bdev_nvme_set_multipath_policy", 00:06:49.466 "bdev_nvme_set_preferred_path", 00:06:49.466 "bdev_nvme_get_io_paths", 00:06:49.466 "bdev_nvme_remove_error_injection", 00:06:49.466 "bdev_nvme_add_error_injection", 00:06:49.466 "bdev_nvme_get_discovery_info", 00:06:49.466 "bdev_nvme_stop_discovery", 00:06:49.466 "bdev_nvme_start_discovery", 00:06:49.466 "bdev_nvme_get_controller_health_info", 00:06:49.466 "bdev_nvme_disable_controller", 00:06:49.466 "bdev_nvme_enable_controller", 00:06:49.466 "bdev_nvme_reset_controller", 00:06:49.466 "bdev_nvme_get_transport_statistics", 00:06:49.466 "bdev_nvme_apply_firmware", 00:06:49.466 "bdev_nvme_detach_controller", 00:06:49.466 "bdev_nvme_get_controllers", 00:06:49.466 "bdev_nvme_attach_controller", 00:06:49.466 "bdev_nvme_set_hotplug", 00:06:49.466 "bdev_nvme_set_options", 00:06:49.466 "bdev_null_resize", 00:06:49.466 "bdev_null_delete", 00:06:49.466 "bdev_null_create", 00:06:49.466 "bdev_malloc_delete", 00:06:49.466 "bdev_malloc_create" 00:06:49.466 ] 00:06:49.466 01:22:35 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:49.466 01:22:35 spdkcli_tcp -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:49.466 01:22:35 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:49.724 01:22:35 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:49.724 01:22:35 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 813656 00:06:49.724 01:22:35 spdkcli_tcp -- common/autotest_common.sh@950 -- # '[' -z 813656 ']' 00:06:49.724 01:22:35 spdkcli_tcp -- common/autotest_common.sh@954 -- # kill -0 813656 00:06:49.724 01:22:35 spdkcli_tcp -- common/autotest_common.sh@955 -- # uname 00:06:49.724 01:22:35 spdkcli_tcp -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:49.725 01:22:35 spdkcli_tcp -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 813656 00:06:49.725 01:22:35 spdkcli_tcp -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:49.725 
01:22:35 spdkcli_tcp -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:49.725 01:22:35 spdkcli_tcp -- common/autotest_common.sh@968 -- # echo 'killing process with pid 813656' 00:06:49.725 killing process with pid 813656 00:06:49.725 01:22:35 spdkcli_tcp -- common/autotest_common.sh@969 -- # kill 813656 00:06:49.725 01:22:35 spdkcli_tcp -- common/autotest_common.sh@974 -- # wait 813656 00:06:49.983 00:06:49.983 real 0m1.116s 00:06:49.983 user 0m1.822s 00:06:49.983 sys 0m0.488s 00:06:49.983 01:22:35 spdkcli_tcp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:49.983 01:22:35 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:49.983 ************************************ 00:06:49.983 END TEST spdkcli_tcp 00:06:49.983 ************************************ 00:06:49.983 01:22:35 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:49.983 01:22:35 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:49.983 01:22:35 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:49.983 01:22:35 -- common/autotest_common.sh@10 -- # set +x 00:06:49.983 ************************************ 00:06:49.983 START TEST dpdk_mem_utility 00:06:49.983 ************************************ 00:06:49.983 01:22:35 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:50.242 * Looking for test storage... 00:06:50.242 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:50.242 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:50.242 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # lcov --version 00:06:50.242 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:50.242 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:50.242 01:22:36 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:06:50.242 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:50.242 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:50.242 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.242 --rc genhtml_branch_coverage=1 00:06:50.242 --rc genhtml_function_coverage=1 00:06:50.242 --rc genhtml_legend=1 00:06:50.242 --rc geninfo_all_blocks=1 00:06:50.242 --rc geninfo_unexecuted_blocks=1 00:06:50.242 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.242 ' 00:06:50.242 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:50.243 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.243 --rc genhtml_branch_coverage=1 00:06:50.243 --rc genhtml_function_coverage=1 00:06:50.243 --rc genhtml_legend=1 00:06:50.243 --rc geninfo_all_blocks=1 00:06:50.243 --rc geninfo_unexecuted_blocks=1 00:06:50.243 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.243 ' 00:06:50.243 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:50.243 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.243 --rc genhtml_branch_coverage=1 00:06:50.243 --rc genhtml_function_coverage=1 00:06:50.243 --rc genhtml_legend=1 00:06:50.243 --rc geninfo_all_blocks=1 00:06:50.243 --rc geninfo_unexecuted_blocks=1 00:06:50.243 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.243 ' 00:06:50.243 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:50.243 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.243 --rc genhtml_branch_coverage=1 00:06:50.243 --rc genhtml_function_coverage=1 00:06:50.243 --rc genhtml_legend=1 00:06:50.243 --rc geninfo_all_blocks=1 00:06:50.243 --rc geninfo_unexecuted_blocks=1 00:06:50.243 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.243 ' 00:06:50.243 01:22:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:50.243 01:22:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=813993 00:06:50.243 01:22:36 dpdk_mem_utility -- 
dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 813993 00:06:50.243 01:22:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:50.243 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@831 -- # '[' -z 813993 ']' 00:06:50.243 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:50.243 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:50.243 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:50.243 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:50.243 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:50.243 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:50.243 [2024-12-17 01:22:36.135805] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:50.243 [2024-12-17 01:22:36.135882] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid813993 ] 00:06:50.243 [2024-12-17 01:22:36.197991] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.243 [2024-12-17 01:22:36.235904] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.502 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:50.502 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@864 -- # return 0 00:06:50.502 01:22:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:50.502 01:22:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:50.502 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.502 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:50.502 { 00:06:50.502 "filename": "/tmp/spdk_mem_dump.txt" 00:06:50.502 } 00:06:50.502 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.502 01:22:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:50.502 DPDK memory size 860.000000 MiB in 1 heap(s) 00:06:50.502 1 heaps totaling size 860.000000 MiB 00:06:50.502 size: 860.000000 MiB heap id: 0 00:06:50.502 end heaps---------- 00:06:50.502 9 mempools totaling size 642.649841 MiB 00:06:50.502 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:50.502 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:50.502 size: 92.545471 MiB name: bdev_io_813993 00:06:50.502 size: 51.011292 MiB name: evtpool_813993 00:06:50.502 size: 50.003479 MiB name: msgpool_813993 00:06:50.502 size: 36.509338 MiB name: fsdev_io_813993 00:06:50.502 size: 21.763794 MiB name: PDU_Pool 00:06:50.502 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:50.502 size: 0.026123 MiB name: Session_Pool 00:06:50.502 end mempools------- 00:06:50.502 6 memzones totaling size 4.142822 MiB 00:06:50.502 size: 1.000366 MiB name: RG_ring_0_813993 00:06:50.502 size: 1.000366 MiB name: RG_ring_1_813993 00:06:50.502 size: 1.000366 MiB name: RG_ring_4_813993 
00:06:50.502 size: 1.000366 MiB name: RG_ring_5_813993 00:06:50.502 size: 0.125366 MiB name: RG_ring_2_813993 00:06:50.502 size: 0.015991 MiB name: RG_ring_3_813993 00:06:50.502 end memzones------- 00:06:50.502 01:22:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:50.762 heap id: 0 total size: 860.000000 MiB number of busy elements: 44 number of free elements: 16 00:06:50.762 list of free elements. size: 13.984680 MiB 00:06:50.762 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:50.762 element at address: 0x200000800000 with size: 1.996948 MiB 00:06:50.762 element at address: 0x20001bc00000 with size: 0.999878 MiB 00:06:50.762 element at address: 0x20001be00000 with size: 0.999878 MiB 00:06:50.762 element at address: 0x200034a00000 with size: 0.994446 MiB 00:06:50.762 element at address: 0x20000b200000 with size: 0.959839 MiB 00:06:50.762 element at address: 0x200015e00000 with size: 0.954285 MiB 00:06:50.762 element at address: 0x20001c000000 with size: 0.936584 MiB 00:06:50.762 element at address: 0x200000200000 with size: 0.841614 MiB 00:06:50.762 element at address: 0x20001d800000 with size: 0.582886 MiB 00:06:50.762 element at address: 0x200003e00000 with size: 0.495605 MiB 00:06:50.762 element at address: 0x200007000000 with size: 0.490723 MiB 00:06:50.762 element at address: 0x20001c200000 with size: 0.485657 MiB 00:06:50.762 element at address: 0x200013800000 with size: 0.481934 MiB 00:06:50.762 element at address: 0x20002ac00000 with size: 0.410034 MiB 00:06:50.762 element at address: 0x200003a00000 with size: 0.354858 MiB 00:06:50.762 list of standard malloc elements. size: 199.218628 MiB 00:06:50.762 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:50.762 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:50.762 element at address: 0x20001bcfff80 with size: 1.000122 MiB 00:06:50.762 element at address: 0x20001befff80 with size: 1.000122 MiB 00:06:50.762 element at address: 0x20001c0fff80 with size: 1.000122 MiB 00:06:50.762 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:50.762 element at address: 0x20001c0eff00 with size: 0.062622 MiB 00:06:50.762 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:50.762 element at address: 0x20001c0efdc0 with size: 0.000305 MiB 00:06:50.762 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:06:50.762 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:06:50.762 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:06:50.762 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:06:50.762 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:06:50.762 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:06:50.762 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:06:50.762 element at address: 0x200003a5ad80 with size: 0.000183 MiB 00:06:50.762 element at address: 0x200003a5af80 with size: 0.000183 MiB 00:06:50.762 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:06:50.762 element at address: 0x200003adb300 with size: 0.000183 MiB 00:06:50.762 element at address: 0x200003adb500 with size: 0.000183 MiB 00:06:50.762 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:06:50.762 element at address: 0x200003affa80 with size: 0.000183 MiB 00:06:50.762 element at address: 0x200003affb40 with size: 0.000183 MiB 00:06:50.762 element at address: 0x200003e7ee00 with size: 0.000183 
MiB 00:06:50.762 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20000707da00 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20000707dac0 with size: 0.000183 MiB 00:06:50.762 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20001387b600 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20001387b6c0 with size: 0.000183 MiB 00:06:50.762 element at address: 0x2000138fb980 with size: 0.000183 MiB 00:06:50.762 element at address: 0x200015ef44c0 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20001c0efc40 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20001c0efd00 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20001c2bc740 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20001d895380 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20001d895440 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20002ac68f80 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20002ac69040 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20002ac6fc40 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20002ac6fe40 with size: 0.000183 MiB 00:06:50.762 element at address: 0x20002ac6ff00 with size: 0.000183 MiB 00:06:50.762 list of memzone associated elements. size: 646.796692 MiB 00:06:50.762 element at address: 0x20001d895500 with size: 211.416748 MiB 00:06:50.762 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:50.762 element at address: 0x20002ac6ffc0 with size: 157.562561 MiB 00:06:50.762 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:50.762 element at address: 0x200015ff4780 with size: 92.045044 MiB 00:06:50.762 associated memzone info: size: 92.044922 MiB name: MP_bdev_io_813993_0 00:06:50.762 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:50.762 associated memzone info: size: 48.002930 MiB name: MP_evtpool_813993_0 00:06:50.762 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:50.762 associated memzone info: size: 48.002930 MiB name: MP_msgpool_813993_0 00:06:50.762 element at address: 0x2000139fdb80 with size: 36.008911 MiB 00:06:50.762 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_813993_0 00:06:50.762 element at address: 0x20001c3be940 with size: 20.255554 MiB 00:06:50.762 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:50.762 element at address: 0x200034bfeb40 with size: 18.005066 MiB 00:06:50.762 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:50.762 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:50.762 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_813993 00:06:50.762 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:50.762 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_813993 00:06:50.762 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:06:50.762 associated memzone info: size: 1.007996 MiB name: MP_evtpool_813993 00:06:50.762 element at address: 0x2000138fba40 with size: 1.008118 MiB 00:06:50.762 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:50.762 element at address: 0x20001c2bc800 with size: 1.008118 MiB 00:06:50.762 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:50.762 element at address: 0x20000b2fde40 with size: 1.008118 
MiB 00:06:50.762 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:50.762 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:50.762 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:50.762 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:50.762 associated memzone info: size: 1.000366 MiB name: RG_ring_0_813993 00:06:50.762 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:50.762 associated memzone info: size: 1.000366 MiB name: RG_ring_1_813993 00:06:50.762 element at address: 0x200015ef4580 with size: 1.000488 MiB 00:06:50.762 associated memzone info: size: 1.000366 MiB name: RG_ring_4_813993 00:06:50.762 element at address: 0x200034afe940 with size: 1.000488 MiB 00:06:50.762 associated memzone info: size: 1.000366 MiB name: RG_ring_5_813993 00:06:50.762 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:06:50.762 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_813993 00:06:50.762 element at address: 0x200003e7eec0 with size: 0.500488 MiB 00:06:50.762 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_813993 00:06:50.762 element at address: 0x20001387b780 with size: 0.500488 MiB 00:06:50.762 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:50.762 element at address: 0x20000707db80 with size: 0.500488 MiB 00:06:50.762 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:50.762 element at address: 0x20001c27c540 with size: 0.250488 MiB 00:06:50.762 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:50.762 element at address: 0x200003adf880 with size: 0.125488 MiB 00:06:50.762 associated memzone info: size: 0.125366 MiB name: RG_ring_2_813993 00:06:50.762 element at address: 0x20000b2f5b80 with size: 0.031738 MiB 00:06:50.762 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:50.762 element at address: 0x20002ac69100 with size: 0.023743 MiB 00:06:50.762 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:50.762 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:06:50.762 associated memzone info: size: 0.015991 MiB name: RG_ring_3_813993 00:06:50.762 element at address: 0x20002ac6f240 with size: 0.002441 MiB 00:06:50.762 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:50.762 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:06:50.762 associated memzone info: size: 0.000183 MiB name: MP_msgpool_813993 00:06:50.762 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:06:50.762 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_813993 00:06:50.762 element at address: 0x200003a5ae40 with size: 0.000305 MiB 00:06:50.762 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_813993 00:06:50.762 element at address: 0x20002ac6fd00 with size: 0.000305 MiB 00:06:50.762 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:50.762 01:22:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:50.762 01:22:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 813993 00:06:50.763 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@950 -- # '[' -z 813993 ']' 00:06:50.763 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@954 -- # kill -0 813993 00:06:50.763 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@955 -- # uname 00:06:50.763 01:22:36 dpdk_mem_utility 
-- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:50.763 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 813993 00:06:50.763 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:50.763 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:50.763 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@968 -- # echo 'killing process with pid 813993' 00:06:50.763 killing process with pid 813993 00:06:50.763 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@969 -- # kill 813993 00:06:50.763 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@974 -- # wait 813993 00:06:51.021 00:06:51.022 real 0m0.955s 00:06:51.022 user 0m0.847s 00:06:51.022 sys 0m0.435s 00:06:51.022 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:51.022 01:22:36 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:51.022 ************************************ 00:06:51.022 END TEST dpdk_mem_utility 00:06:51.022 ************************************ 00:06:51.022 01:22:36 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:51.022 01:22:36 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:51.022 01:22:36 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:51.022 01:22:36 -- common/autotest_common.sh@10 -- # set +x 00:06:51.022 ************************************ 00:06:51.022 START TEST event 00:06:51.022 ************************************ 00:06:51.022 01:22:36 event -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:51.281 * Looking for test storage... 00:06:51.281 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:51.281 01:22:37 event -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:51.281 01:22:37 event -- common/autotest_common.sh@1681 -- # lcov --version 00:06:51.281 01:22:37 event -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:51.281 01:22:37 event -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:51.281 01:22:37 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:51.281 01:22:37 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:51.281 01:22:37 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:51.281 01:22:37 event -- scripts/common.sh@336 -- # IFS=.-: 00:06:51.281 01:22:37 event -- scripts/common.sh@336 -- # read -ra ver1 00:06:51.281 01:22:37 event -- scripts/common.sh@337 -- # IFS=.-: 00:06:51.281 01:22:37 event -- scripts/common.sh@337 -- # read -ra ver2 00:06:51.281 01:22:37 event -- scripts/common.sh@338 -- # local 'op=<' 00:06:51.281 01:22:37 event -- scripts/common.sh@340 -- # ver1_l=2 00:06:51.281 01:22:37 event -- scripts/common.sh@341 -- # ver2_l=1 00:06:51.281 01:22:37 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:51.281 01:22:37 event -- scripts/common.sh@344 -- # case "$op" in 00:06:51.281 01:22:37 event -- scripts/common.sh@345 -- # : 1 00:06:51.281 01:22:37 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:51.281 01:22:37 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:51.281 01:22:37 event -- scripts/common.sh@365 -- # decimal 1 00:06:51.281 01:22:37 event -- scripts/common.sh@353 -- # local d=1 00:06:51.281 01:22:37 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:51.281 01:22:37 event -- scripts/common.sh@355 -- # echo 1 00:06:51.281 01:22:37 event -- scripts/common.sh@365 -- # ver1[v]=1 00:06:51.281 01:22:37 event -- scripts/common.sh@366 -- # decimal 2 00:06:51.281 01:22:37 event -- scripts/common.sh@353 -- # local d=2 00:06:51.281 01:22:37 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:51.281 01:22:37 event -- scripts/common.sh@355 -- # echo 2 00:06:51.281 01:22:37 event -- scripts/common.sh@366 -- # ver2[v]=2 00:06:51.281 01:22:37 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:51.281 01:22:37 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:51.281 01:22:37 event -- scripts/common.sh@368 -- # return 0 00:06:51.281 01:22:37 event -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:51.281 01:22:37 event -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:51.281 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.281 --rc genhtml_branch_coverage=1 00:06:51.281 --rc genhtml_function_coverage=1 00:06:51.281 --rc genhtml_legend=1 00:06:51.281 --rc geninfo_all_blocks=1 00:06:51.281 --rc geninfo_unexecuted_blocks=1 00:06:51.281 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.281 ' 00:06:51.281 01:22:37 event -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:51.281 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.281 --rc genhtml_branch_coverage=1 00:06:51.281 --rc genhtml_function_coverage=1 00:06:51.281 --rc genhtml_legend=1 00:06:51.281 --rc geninfo_all_blocks=1 00:06:51.281 --rc geninfo_unexecuted_blocks=1 00:06:51.281 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.281 ' 00:06:51.281 01:22:37 event -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:51.281 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.281 --rc genhtml_branch_coverage=1 00:06:51.281 --rc genhtml_function_coverage=1 00:06:51.281 --rc genhtml_legend=1 00:06:51.281 --rc geninfo_all_blocks=1 00:06:51.281 --rc geninfo_unexecuted_blocks=1 00:06:51.281 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.281 ' 00:06:51.281 01:22:37 event -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:51.281 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.281 --rc genhtml_branch_coverage=1 00:06:51.281 --rc genhtml_function_coverage=1 00:06:51.281 --rc genhtml_legend=1 00:06:51.281 --rc geninfo_all_blocks=1 00:06:51.281 --rc geninfo_unexecuted_blocks=1 00:06:51.281 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.281 ' 00:06:51.281 01:22:37 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:51.281 01:22:37 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:51.281 01:22:37 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:51.281 01:22:37 event -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:06:51.281 01:22:37 event -- common/autotest_common.sh@1107 -- # xtrace_disable 
00:06:51.281 01:22:37 event -- common/autotest_common.sh@10 -- # set +x 00:06:51.281 ************************************ 00:06:51.281 START TEST event_perf 00:06:51.281 ************************************ 00:06:51.281 01:22:37 event.event_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:51.281 Running I/O for 1 seconds...[2024-12-17 01:22:37.202263] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:51.281 [2024-12-17 01:22:37.202345] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid814108 ] 00:06:51.281 [2024-12-17 01:22:37.268915] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:51.540 [2024-12-17 01:22:37.310214] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:51.540 [2024-12-17 01:22:37.310307] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:06:51.540 [2024-12-17 01:22:37.310373] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:06:51.540 [2024-12-17 01:22:37.310375] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.476 Running I/O for 1 seconds... 00:06:52.476 lcore 0: 196120 00:06:52.476 lcore 1: 196118 00:06:52.476 lcore 2: 196118 00:06:52.476 lcore 3: 196118 00:06:52.476 done. 00:06:52.476 00:06:52.476 real 0m1.176s 00:06:52.476 user 0m4.082s 00:06:52.476 sys 0m0.090s 00:06:52.476 01:22:38 event.event_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:52.476 01:22:38 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:52.476 ************************************ 00:06:52.476 END TEST event_perf 00:06:52.476 ************************************ 00:06:52.476 01:22:38 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:52.476 01:22:38 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:06:52.476 01:22:38 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:52.476 01:22:38 event -- common/autotest_common.sh@10 -- # set +x 00:06:52.477 ************************************ 00:06:52.477 START TEST event_reactor 00:06:52.477 ************************************ 00:06:52.477 01:22:38 event.event_reactor -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:52.477 [2024-12-17 01:22:38.465819] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:06:52.477 [2024-12-17 01:22:38.465932] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid814359 ] 00:06:52.735 [2024-12-17 01:22:38.537503] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.735 [2024-12-17 01:22:38.572146] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.672 test_start 00:06:53.672 oneshot 00:06:53.672 tick 100 00:06:53.672 tick 100 00:06:53.672 tick 250 00:06:53.672 tick 100 00:06:53.672 tick 100 00:06:53.672 tick 100 00:06:53.672 tick 250 00:06:53.672 tick 500 00:06:53.672 tick 100 00:06:53.672 tick 100 00:06:53.672 tick 250 00:06:53.672 tick 100 00:06:53.672 tick 100 00:06:53.672 test_end 00:06:53.672 00:06:53.672 real 0m1.180s 00:06:53.672 user 0m1.085s 00:06:53.672 sys 0m0.090s 00:06:53.672 01:22:39 event.event_reactor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:53.672 01:22:39 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:53.672 ************************************ 00:06:53.672 END TEST event_reactor 00:06:53.672 ************************************ 00:06:53.672 01:22:39 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:53.672 01:22:39 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:06:53.672 01:22:39 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:53.672 01:22:39 event -- common/autotest_common.sh@10 -- # set +x 00:06:53.931 ************************************ 00:06:53.931 START TEST event_reactor_perf 00:06:53.931 ************************************ 00:06:53.931 01:22:39 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:53.931 [2024-12-17 01:22:39.725740] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:06:53.931 [2024-12-17 01:22:39.725830] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid814639 ] 00:06:53.931 [2024-12-17 01:22:39.796200] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.931 [2024-12-17 01:22:39.834925] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.307 test_start 00:06:55.307 test_end 00:06:55.307 Performance: 913083 events per second 00:06:55.307 00:06:55.307 real 0m1.185s 00:06:55.307 user 0m1.095s 00:06:55.307 sys 0m0.086s 00:06:55.307 01:22:40 event.event_reactor_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:55.307 01:22:40 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:55.307 ************************************ 00:06:55.307 END TEST event_reactor_perf 00:06:55.307 ************************************ 00:06:55.307 01:22:40 event -- event/event.sh@49 -- # uname -s 00:06:55.307 01:22:40 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:55.307 01:22:40 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:55.307 01:22:40 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:55.307 01:22:40 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:55.307 01:22:40 event -- common/autotest_common.sh@10 -- # set +x 00:06:55.307 ************************************ 00:06:55.307 START TEST event_scheduler 00:06:55.307 ************************************ 00:06:55.307 01:22:40 event.event_scheduler -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:55.307 * Looking for test storage... 
00:06:55.307 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@1681 -- # lcov --version 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:55.307 01:22:41 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:55.307 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.307 --rc genhtml_branch_coverage=1 00:06:55.307 --rc genhtml_function_coverage=1 00:06:55.307 --rc genhtml_legend=1 00:06:55.307 --rc geninfo_all_blocks=1 00:06:55.307 --rc geninfo_unexecuted_blocks=1 00:06:55.307 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.307 ' 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:55.307 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.307 --rc genhtml_branch_coverage=1 00:06:55.307 --rc genhtml_function_coverage=1 00:06:55.307 --rc genhtml_legend=1 00:06:55.307 --rc geninfo_all_blocks=1 00:06:55.307 --rc geninfo_unexecuted_blocks=1 00:06:55.307 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.307 ' 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:55.307 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.307 --rc genhtml_branch_coverage=1 00:06:55.307 --rc genhtml_function_coverage=1 00:06:55.307 --rc genhtml_legend=1 00:06:55.307 --rc geninfo_all_blocks=1 00:06:55.307 --rc geninfo_unexecuted_blocks=1 00:06:55.307 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.307 ' 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:55.307 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:55.307 --rc genhtml_branch_coverage=1 00:06:55.307 --rc genhtml_function_coverage=1 00:06:55.307 --rc genhtml_legend=1 00:06:55.307 --rc geninfo_all_blocks=1 00:06:55.307 --rc geninfo_unexecuted_blocks=1 00:06:55.307 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:55.307 ' 00:06:55.307 01:22:41 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:55.307 01:22:41 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=814961 00:06:55.307 01:22:41 event.event_scheduler -- 
scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:55.307 01:22:41 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:55.307 01:22:41 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 814961 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@831 -- # '[' -z 814961 ']' 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:55.307 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:55.307 01:22:41 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:55.307 [2024-12-17 01:22:41.183911] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:55.307 [2024-12-17 01:22:41.183971] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid814961 ] 00:06:55.307 [2024-12-17 01:22:41.246870] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:55.307 [2024-12-17 01:22:41.286997] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.307 [2024-12-17 01:22:41.287082] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:55.307 [2024-12-17 01:22:41.287165] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:06:55.307 [2024-12-17 01:22:41.287166] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:06:55.567 01:22:41 event.event_scheduler -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:55.567 01:22:41 event.event_scheduler -- common/autotest_common.sh@864 -- # return 0 00:06:55.567 01:22:41 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:55.567 01:22:41 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:55.567 01:22:41 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:55.567 [2024-12-17 01:22:41.371852] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:06:55.567 [2024-12-17 01:22:41.371875] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:06:55.567 [2024-12-17 01:22:41.371886] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:55.567 [2024-12-17 01:22:41.371894] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:55.567 [2024-12-17 01:22:41.371901] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:55.567 01:22:41 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:55.567 01:22:41 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:55.567 01:22:41 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 
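The lt/cmp_versions trace above is the stock dotted-version check from scripts/common.sh: split both version strings into components, compare them one by one, and use the result ("1.15" is less than "2") to decide that the installed lcov still needs the old-style --rc lcov_branch_coverage/lcov_function_coverage flags. A minimal self-contained sketch of that idiom follows; the function name is illustrative and numeric-only components are assumed, so this is not the SPDK helper itself.

#!/usr/bin/env bash
# Sketch of a dotted-version "less than" test in the spirit of the
# cmp_versions/lt helpers traced above (illustrative, not the SPDK helper).
version_lt() {
    local -a a b
    IFS=.- read -ra a <<< "$1"
    IFS=.- read -ra b <<< "$2"
    local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
    for (( i = 0; i < n; i++ )); do
        local x=${a[i]:-0} y=${b[i]:-0}   # missing components count as 0
        (( x < y )) && return 0           # strictly smaller -> "less than"
        (( x > y )) && return 1           # strictly larger  -> not less than
    done
    return 1                              # equal -> not less than
}

# Mirrors the trace: "lt 1.15 2" succeeds, so the 1.x-style --rc flags are kept.
if version_lt 1.15 2; then
    echo 'use --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
fi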
00:06:55.567 01:22:41 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:55.567 [2024-12-17 01:22:41.439245] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:06:55.567 01:22:41 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:55.567 01:22:41 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:55.567 01:22:41 event.event_scheduler -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:55.567 01:22:41 event.event_scheduler -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:55.567 01:22:41 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:55.567 ************************************ 00:06:55.567 START TEST scheduler_create_thread 00:06:55.567 ************************************ 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # scheduler_create_thread 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:55.567 2 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:55.567 3 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:55.567 4 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:55.567 5 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:55.567 
01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:55.567 6 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:55.567 7 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:55.567 8 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:55.567 01:22:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:55.568 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:55.568 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:55.568 9 00:06:55.568 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:55.568 01:22:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:55.568 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:55.568 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:55.826 10 00:06:55.826 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:55.826 01:22:41 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:55.826 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:55.826 01:22:41 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:56.084 01:22:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:56.084 01:22:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:56.084 01:22:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:56.084 01:22:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:56.084 01:22:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.017 01:22:42 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:57.017 01:22:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:57.017 01:22:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:57.017 01:22:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:57.951 01:22:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:57.951 01:22:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:57.951 01:22:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:57.951 01:22:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:57.951 01:22:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:58.884 01:22:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:58.884 00:06:58.884 real 0m3.231s 00:06:58.884 user 0m0.025s 00:06:58.884 sys 0m0.007s 00:06:58.884 01:22:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:58.884 01:22:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:58.884 ************************************ 00:06:58.884 END TEST scheduler_create_thread 00:06:58.884 ************************************ 00:06:58.884 01:22:44 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:58.884 01:22:44 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 814961 00:06:58.884 01:22:44 event.event_scheduler -- common/autotest_common.sh@950 -- # '[' -z 814961 ']' 00:06:58.884 01:22:44 event.event_scheduler -- common/autotest_common.sh@954 -- # kill -0 814961 00:06:58.884 01:22:44 event.event_scheduler -- common/autotest_common.sh@955 -- # uname 00:06:58.884 01:22:44 event.event_scheduler -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:58.884 01:22:44 event.event_scheduler -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 814961 00:06:58.884 01:22:44 event.event_scheduler -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:06:58.884 01:22:44 event.event_scheduler -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:06:58.884 01:22:44 event.event_scheduler -- common/autotest_common.sh@968 -- # echo 'killing process with pid 814961' 00:06:58.884 killing process with pid 814961 00:06:58.884 01:22:44 event.event_scheduler -- common/autotest_common.sh@969 -- # kill 814961 00:06:58.884 01:22:44 event.event_scheduler -- common/autotest_common.sh@974 -- # wait 814961 00:06:59.142 [2024-12-17 01:22:45.088593] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
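Condensed, the scheduler_create_thread test that just finished drives the running scheduler app purely through its RPC plugin: one fully active and one idle thread pinned to each of the four cores, two unpinned threads with partial activity, one of them bumped to 50% busy, and one thread created only to be deleted again. A compressed reconstruction of that call sequence, read off the trace, is sketched below; the rpc wrapper is illustrative, it assumes scheduler_plugin is importable by rpc.py, and the app is already running with --wait-for-rpc on the default /var/tmp/spdk.sock.

#!/usr/bin/env bash
# Reconstruction of the RPC calls exercised by scheduler_create_thread above.
RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
rpc() { "$RPC" --plugin scheduler_plugin "$@"; }

# One fully active and one fully idle thread pinned to each of cores 0-3.
for mask in 0x1 0x2 0x4 0x8; do
    rpc scheduler_thread_create -n active_pinned -m "$mask" -a 100
done
for mask in 0x1 0x2 0x4 0x8; do
    rpc scheduler_thread_create -n idle_pinned -m "$mask" -a 0
done

# Unpinned threads: one at 30% activity, one created idle and then set to
# 50% busy (the create RPC prints the new thread id).
rpc scheduler_thread_create -n one_third_active -a 30
thread_id=$(rpc scheduler_thread_create -n half_active -a 0)
rpc scheduler_thread_set_active "$thread_id" 50

# Finally create a thread and delete it again.
thread_id=$(rpc scheduler_thread_create -n deleted -a 100)
rpc scheduler_thread_delete "$thread_id"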
00:06:59.399 00:06:59.399 real 0m4.349s 00:06:59.399 user 0m7.540s 00:06:59.399 sys 0m0.440s 00:06:59.399 01:22:45 event.event_scheduler -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:59.399 01:22:45 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:59.399 ************************************ 00:06:59.399 END TEST event_scheduler 00:06:59.399 ************************************ 00:06:59.399 01:22:45 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:59.399 01:22:45 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:59.399 01:22:45 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:59.399 01:22:45 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:59.399 01:22:45 event -- common/autotest_common.sh@10 -- # set +x 00:06:59.658 ************************************ 00:06:59.658 START TEST app_repeat 00:06:59.658 ************************************ 00:06:59.658 01:22:45 event.app_repeat -- common/autotest_common.sh@1125 -- # app_repeat_test 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@19 -- # repeat_pid=815803 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 815803' 00:06:59.658 Process app_repeat pid: 815803 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:59.658 spdk_app_start Round 0 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@25 -- # waitforlisten 815803 /var/tmp/spdk-nbd.sock 00:06:59.658 01:22:45 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 815803 ']' 00:06:59.658 01:22:45 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:59.658 01:22:45 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:59.658 01:22:45 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:59.658 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:59.658 01:22:45 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:59.658 01:22:45 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:59.658 [2024-12-17 01:22:45.435907] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:06:59.658 [2024-12-17 01:22:45.435989] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid815803 ] 00:06:59.658 [2024-12-17 01:22:45.506345] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:59.658 [2024-12-17 01:22:45.546428] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:59.658 [2024-12-17 01:22:45.546431] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.658 01:22:45 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:59.658 01:22:45 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:06:59.658 01:22:45 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:59.915 Malloc0 00:06:59.916 01:22:45 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:00.173 Malloc1 00:07:00.173 01:22:46 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:00.173 01:22:46 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:00.174 01:22:46 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:00.431 /dev/nbd0 00:07:00.432 01:22:46 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:00.432 01:22:46 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 
/proc/partitions 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:00.432 1+0 records in 00:07:00.432 1+0 records out 00:07:00.432 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224747 s, 18.2 MB/s 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:00.432 01:22:46 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:00.432 01:22:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:00.432 01:22:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:00.432 01:22:46 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:00.690 /dev/nbd1 00:07:00.690 01:22:46 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:00.690 01:22:46 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:00.690 1+0 records in 00:07:00.690 1+0 records out 00:07:00.690 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254345 s, 16.1 MB/s 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:00.690 01:22:46 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:00.690 01:22:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:00.690 01:22:46 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 
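Each nbd_start_disk above is immediately followed by a waitfornbd probe: poll /proc/partitions until the device node is registered, then read one 4 KiB block with O_DIRECT and check that a non-empty file came back. A stripped-down sketch of that readiness check is below; the retry limit, sleep interval and temp path are illustrative choices (the real test writes to spdk/test/event/nbdtest).

#!/usr/bin/env bash
# Sketch of the waitfornbd readiness probe seen in the trace: wait for the
# kernel to expose /dev/nbdX, then prove it is readable with one direct-I/O block.
waitfornbd() {
    local nbd_name=$1 i size tmp
    tmp=$(mktemp)

    # Give the device up to ~20 polls to show up in /proc/partitions.
    for (( i = 1; i <= 20; i++ )); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1
    done
    grep -q -w "$nbd_name" /proc/partitions || { rm -f "$tmp"; return 1; }

    # Read exactly one 4096-byte block with O_DIRECT and confirm it arrived.
    dd if="/dev/$nbd_name" of="$tmp" bs=4096 count=1 iflag=direct
    size=$(stat -c %s "$tmp")
    rm -f "$tmp"
    [ "$size" != 0 ]
}

waitfornbd nbd0 && echo "nbd0 is ready"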
00:07:00.690 01:22:46 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:00.690 01:22:46 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:00.690 01:22:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:00.948 { 00:07:00.948 "nbd_device": "/dev/nbd0", 00:07:00.948 "bdev_name": "Malloc0" 00:07:00.948 }, 00:07:00.948 { 00:07:00.948 "nbd_device": "/dev/nbd1", 00:07:00.948 "bdev_name": "Malloc1" 00:07:00.948 } 00:07:00.948 ]' 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:00.948 { 00:07:00.948 "nbd_device": "/dev/nbd0", 00:07:00.948 "bdev_name": "Malloc0" 00:07:00.948 }, 00:07:00.948 { 00:07:00.948 "nbd_device": "/dev/nbd1", 00:07:00.948 "bdev_name": "Malloc1" 00:07:00.948 } 00:07:00.948 ]' 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:00.948 /dev/nbd1' 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:00.948 /dev/nbd1' 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:00.948 256+0 records in 00:07:00.948 256+0 records out 00:07:00.948 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0116198 s, 90.2 MB/s 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:00.948 256+0 records in 00:07:00.948 256+0 records out 00:07:00.948 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0193819 s, 54.1 MB/s 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:00.948 256+0 records in 00:07:00.948 256+0 records out 00:07:00.948 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210531 s, 49.8 
MB/s 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:00.948 01:22:46 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:01.206 01:22:47 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:01.206 01:22:47 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:01.206 01:22:47 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:01.206 01:22:47 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:01.206 01:22:47 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:01.206 01:22:47 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:01.206 01:22:47 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:01.206 01:22:47 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:01.206 01:22:47 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:01.206 01:22:47 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:01.464 01:22:47 
event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:01.464 01:22:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:01.722 01:22:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:01.722 01:22:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:01.722 01:22:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:01.722 01:22:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:01.722 01:22:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:01.722 01:22:47 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:01.722 01:22:47 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:01.722 01:22:47 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:01.722 01:22:47 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:01.722 01:22:47 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:01.722 01:22:47 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:01.980 [2024-12-17 01:22:47.871059] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:01.980 [2024-12-17 01:22:47.905269] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:01.980 [2024-12-17 01:22:47.905271] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.980 [2024-12-17 01:22:47.946280] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:01.980 [2024-12-17 01:22:47.946323] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:05.260 01:22:50 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:05.260 01:22:50 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:05.260 spdk_app_start Round 1 00:07:05.260 01:22:50 event.app_repeat -- event/event.sh@25 -- # waitforlisten 815803 /var/tmp/spdk-nbd.sock 00:07:05.260 01:22:50 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 815803 ']' 00:07:05.260 01:22:50 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:05.260 01:22:50 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:05.260 01:22:50 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:05.260 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
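The data path itself is checked by nbd_dd_data_verify: 1 MiB of random data (256 x 4 KiB blocks) is written through each nbd device with O_DIRECT and then compared byte-for-byte against the device with cmp. A compact sketch of that write/verify cycle follows; the temp path is illustrative (the real test uses spdk/test/event/nbdrandtest), while the sizes and the cmp invocation mirror the trace.

#!/usr/bin/env bash
set -e
# Sketch of the nbd_dd_data_verify cycle from the trace: push random data
# through every nbd device, read it back and compare.
nbd_list=(/dev/nbd0 /dev/nbd1)
tmp_file=$(mktemp)

# Generate 256 x 4 KiB = 1 MiB of random data.
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256

# Write the same data to each exported bdev, bypassing the page cache.
for nbd in "${nbd_list[@]}"; do
    dd if="$tmp_file" of="$nbd" bs=4096 count=256 oflag=direct
done

# Read it back: cmp exits non-zero (and set -e aborts the script) on any mismatch.
for nbd in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$nbd"
done

rm "$tmp_file"
echo "data verified on ${nbd_list[*]}"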
00:07:05.260 01:22:50 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:05.260 01:22:50 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:05.260 01:22:50 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:05.260 01:22:50 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:05.260 01:22:50 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:05.260 Malloc0 00:07:05.260 01:22:51 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:05.518 Malloc1 00:07:05.519 01:22:51 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:05.519 /dev/nbd0 00:07:05.519 01:22:51 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:05.777 01:22:51 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:05.777 1+0 records in 00:07:05.777 1+0 records out 00:07:05.777 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226017 s, 18.1 MB/s 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:05.777 01:22:51 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:05.777 01:22:51 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:05.777 01:22:51 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:05.777 /dev/nbd1 00:07:05.777 01:22:51 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:05.777 01:22:51 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:05.777 1+0 records in 00:07:05.777 1+0 records out 00:07:05.777 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227688 s, 18.0 MB/s 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:05.777 01:22:51 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:06.035 01:22:51 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:06.035 01:22:51 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:06.035 01:22:51 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:06.035 01:22:51 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:06.035 01:22:51 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:06.035 01:22:51 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:06.035 01:22:51 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:06.035 01:22:51 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:06.035 01:22:51 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:06.035 { 00:07:06.035 "nbd_device": "/dev/nbd0", 00:07:06.035 "bdev_name": "Malloc0" 00:07:06.035 }, 00:07:06.035 { 00:07:06.035 "nbd_device": "/dev/nbd1", 00:07:06.035 "bdev_name": "Malloc1" 00:07:06.035 } 00:07:06.035 ]' 00:07:06.035 01:22:51 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:06.035 { 00:07:06.035 "nbd_device": "/dev/nbd0", 00:07:06.035 "bdev_name": "Malloc0" 00:07:06.035 }, 00:07:06.035 { 00:07:06.035 "nbd_device": "/dev/nbd1", 00:07:06.035 "bdev_name": "Malloc1" 00:07:06.035 } 00:07:06.035 ]' 00:07:06.035 01:22:51 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:06.035 /dev/nbd1' 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:06.035 /dev/nbd1' 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:06.035 01:22:52 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:06.293 256+0 records in 00:07:06.293 256+0 records out 00:07:06.293 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108973 s, 96.2 MB/s 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:06.293 256+0 records in 00:07:06.293 256+0 records out 00:07:06.293 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195798 s, 53.6 MB/s 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:06.293 256+0 records in 00:07:06.293 256+0 records out 00:07:06.293 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0215027 s, 48.8 MB/s 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:06.293 01:22:52 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:06.294 01:22:52 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:06.294 01:22:52 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:06.294 01:22:52 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:06.294 01:22:52 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:06.294 01:22:52 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:06.294 01:22:52 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:06.294 01:22:52 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:06.294 01:22:52 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:06.551 01:22:52 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:06.552 01:22:52 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:06.552 01:22:52 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:06.552 01:22:52 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:06.552 01:22:52 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:06.552 01:22:52 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:06.552 01:22:52 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:06.809 01:22:52 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:06.809 01:22:52 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:07.067 01:22:52 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:07.325 [2024-12-17 01:22:53.144152] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:07.325 [2024-12-17 01:22:53.179042] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:07.325 [2024-12-17 01:22:53.179045] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.325 [2024-12-17 01:22:53.220858] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:07.325 [2024-12-17 01:22:53.220903] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:10.606 01:22:55 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:10.606 01:22:55 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:10.606 spdk_app_start Round 2 00:07:10.606 01:22:55 event.app_repeat -- event/event.sh@25 -- # waitforlisten 815803 /var/tmp/spdk-nbd.sock 00:07:10.606 01:22:55 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 815803 ']' 00:07:10.606 01:22:55 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:10.606 01:22:55 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:10.606 01:22:55 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:10.606 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:10.606 01:22:55 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:10.606 01:22:55 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:10.606 01:22:56 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:10.606 01:22:56 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:10.606 01:22:56 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:10.606 Malloc0 00:07:10.606 01:22:56 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:10.606 Malloc1 00:07:10.606 01:22:56 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:10.606 01:22:56 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:10.864 /dev/nbd0 00:07:10.864 01:22:56 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:10.864 01:22:56 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:10.864 1+0 records in 00:07:10.864 1+0 records out 00:07:10.864 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229437 s, 17.9 MB/s 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:10.864 01:22:56 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:10.864 01:22:56 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:10.864 01:22:56 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:10.864 01:22:56 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:11.122 /dev/nbd1 00:07:11.122 01:22:57 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:11.122 01:22:57 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:11.122 1+0 records in 00:07:11.122 1+0 records out 00:07:11.122 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269245 s, 15.2 MB/s 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:11.122 01:22:57 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:11.122 01:22:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:11.122 01:22:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:11.122 01:22:57 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:11.122 01:22:57 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:11.122 01:22:57 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:11.380 { 00:07:11.380 "nbd_device": "/dev/nbd0", 00:07:11.380 "bdev_name": "Malloc0" 00:07:11.380 }, 00:07:11.380 { 00:07:11.380 "nbd_device": "/dev/nbd1", 00:07:11.380 "bdev_name": "Malloc1" 00:07:11.380 } 00:07:11.380 ]' 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:11.380 { 00:07:11.380 "nbd_device": "/dev/nbd0", 00:07:11.380 "bdev_name": "Malloc0" 00:07:11.380 }, 00:07:11.380 { 00:07:11.380 "nbd_device": "/dev/nbd1", 00:07:11.380 "bdev_name": "Malloc1" 00:07:11.380 } 00:07:11.380 ]' 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:11.380 /dev/nbd1' 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:11.380 /dev/nbd1' 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:11.380 256+0 records in 00:07:11.380 256+0 records out 00:07:11.380 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010575 s, 99.2 MB/s 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:11.380 256+0 records in 00:07:11.380 256+0 records out 00:07:11.380 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197367 s, 53.1 MB/s 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:11.380 256+0 records in 00:07:11.380 256+0 records out 00:07:11.380 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0223625 s, 46.9 MB/s 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:11.380 01:22:57 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:11.381 01:22:57 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:11.638 01:22:57 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:11.638 01:22:57 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:11.638 01:22:57 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:11.638 01:22:57 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:11.638 01:22:57 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:11.638 01:22:57 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:11.638 01:22:57 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:11.638 01:22:57 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:11.638 01:22:57 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:11.638 01:22:57 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:11.896 01:22:57 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:11.896 01:22:57 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:11.896 01:22:57 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:11.896 01:22:57 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:11.896 01:22:57 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:11.896 01:22:57 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:11.896 01:22:57 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:11.896 01:22:57 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:11.896 01:22:57 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:11.896 01:22:57 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:11.896 01:22:57 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:12.154 01:22:57 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:12.154 01:22:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:12.154 01:22:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:12.154 01:22:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:12.154 01:22:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:12.154 01:22:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:12.154 01:22:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:12.154 01:22:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:12.154 01:22:58 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:12.154 01:22:58 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:12.154 01:22:58 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:12.154 01:22:58 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:12.154 01:22:58 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:12.412 01:22:58 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:12.412 [2024-12-17 01:22:58.389471] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:12.670 [2024-12-17 01:22:58.424517] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:12.670 [2024-12-17 01:22:58.424520] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.670 [2024-12-17 01:22:58.465497] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:12.670 [2024-12-17 01:22:58.465541] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:15.950 01:23:01 event.app_repeat -- event/event.sh@38 -- # waitforlisten 815803 /var/tmp/spdk-nbd.sock 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 815803 ']' 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:15.950 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
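The block above is the nbd_rpc_data_verify round trip from bdev/nbd_common.sh: two 64 MiB malloc bdevs (4096-byte blocks) are exposed as NBD devices, a 1 MiB random file is written through each device with dd, read back with cmp, and the devices are torn down until nbd_get_disks reports an empty list. A minimal by-hand sketch of the same flow follows; the /tmp/nbdrandtest path is a stand-in for the workspace file used above, everything else is taken from the log.

  rpc="/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  $rpc bdev_malloc_create 64 4096                    # -> Malloc0
  $rpc bdev_malloc_create 64 4096                    # -> Malloc1
  $rpc nbd_start_disk Malloc0 /dev/nbd0              # export each bdev as an NBD block device
  $rpc nbd_start_disk Malloc1 /dev/nbd1
  dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256
  for d in /dev/nbd0 /dev/nbd1; do
    dd if=/tmp/nbdrandtest of=$d bs=4096 count=256 oflag=direct
    cmp -b -n 1M /tmp/nbdrandtest $d                 # verify the data survived the round trip
  done
  $rpc nbd_stop_disk /dev/nbd0
  $rpc nbd_stop_disk /dev/nbd1
  $rpc nbd_get_disks | jq -r '.[] | .nbd_device'     # empty once both disks are stopped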
00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:15.950 01:23:01 event.app_repeat -- event/event.sh@39 -- # killprocess 815803 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@950 -- # '[' -z 815803 ']' 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@954 -- # kill -0 815803 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@955 -- # uname 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 815803 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 815803' 00:07:15.950 killing process with pid 815803 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@969 -- # kill 815803 00:07:15.950 01:23:01 event.app_repeat -- common/autotest_common.sh@974 -- # wait 815803 00:07:15.951 spdk_app_start is called in Round 0. 00:07:15.951 Shutdown signal received, stop current app iteration 00:07:15.951 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 reinitialization... 00:07:15.951 spdk_app_start is called in Round 1. 00:07:15.951 Shutdown signal received, stop current app iteration 00:07:15.951 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 reinitialization... 00:07:15.951 spdk_app_start is called in Round 2. 00:07:15.951 Shutdown signal received, stop current app iteration 00:07:15.951 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 reinitialization... 00:07:15.951 spdk_app_start is called in Round 3. 
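The Round 0 through Round 3 messages above come from the app_repeat harness in event/event.sh cycling the app under test: after each data-verify pass it asks the running instance to terminate over its RPC socket and waits before the next round starts. A hedged sketch of that round boundary, using only the calls visible in the log:

  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
  sleep 3    # matches event.sh@35; the app logs "Shutdown signal received" and the next round begins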
00:07:15.951 Shutdown signal received, stop current app iteration 00:07:15.951 01:23:01 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:15.951 01:23:01 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:15.951 00:07:15.951 real 0m16.205s 00:07:15.951 user 0m34.805s 00:07:15.951 sys 0m3.162s 00:07:15.951 01:23:01 event.app_repeat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:15.951 01:23:01 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:15.951 ************************************ 00:07:15.951 END TEST app_repeat 00:07:15.951 ************************************ 00:07:15.951 01:23:01 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:15.951 01:23:01 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:15.951 01:23:01 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:15.951 01:23:01 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:15.951 01:23:01 event -- common/autotest_common.sh@10 -- # set +x 00:07:15.951 ************************************ 00:07:15.951 START TEST cpu_locks 00:07:15.951 ************************************ 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:15.951 * Looking for test storage... 00:07:15.951 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1681 -- # lcov --version 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:15.951 01:23:01 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:15.951 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:15.951 --rc genhtml_branch_coverage=1 00:07:15.951 --rc genhtml_function_coverage=1 00:07:15.951 --rc genhtml_legend=1 00:07:15.951 --rc geninfo_all_blocks=1 00:07:15.951 --rc geninfo_unexecuted_blocks=1 00:07:15.951 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:15.951 ' 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:15.951 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:15.951 --rc genhtml_branch_coverage=1 00:07:15.951 --rc genhtml_function_coverage=1 00:07:15.951 --rc genhtml_legend=1 00:07:15.951 --rc geninfo_all_blocks=1 00:07:15.951 --rc geninfo_unexecuted_blocks=1 00:07:15.951 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:15.951 ' 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:15.951 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:15.951 --rc genhtml_branch_coverage=1 00:07:15.951 --rc genhtml_function_coverage=1 00:07:15.951 --rc genhtml_legend=1 00:07:15.951 --rc geninfo_all_blocks=1 00:07:15.951 --rc geninfo_unexecuted_blocks=1 00:07:15.951 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:15.951 ' 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:15.951 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:15.951 --rc genhtml_branch_coverage=1 00:07:15.951 --rc genhtml_function_coverage=1 00:07:15.951 --rc genhtml_legend=1 00:07:15.951 --rc geninfo_all_blocks=1 00:07:15.951 --rc geninfo_unexecuted_blocks=1 00:07:15.951 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:15.951 ' 00:07:15.951 01:23:01 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:15.951 01:23:01 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:15.951 01:23:01 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:15.951 01:23:01 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:15.951 01:23:01 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:15.951 ************************************ 00:07:15.951 START TEST default_locks 00:07:15.951 ************************************ 00:07:15.951 01:23:01 event.cpu_locks.default_locks -- common/autotest_common.sh@1125 -- # default_locks 00:07:15.951 01:23:01 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=818740 00:07:15.951 01:23:01 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 818740 00:07:15.951 01:23:01 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:15.951 01:23:01 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 818740 ']' 00:07:15.951 01:23:01 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:15.951 01:23:01 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:15.951 01:23:01 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:15.951 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:15.951 01:23:01 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:15.951 01:23:01 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:15.951 [2024-12-17 01:23:01.948657] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:15.951 [2024-12-17 01:23:01.948729] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid818740 ] 00:07:16.210 [2024-12-17 01:23:02.017033] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.210 [2024-12-17 01:23:02.054906] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.468 01:23:02 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:16.468 01:23:02 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 0 00:07:16.468 01:23:02 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 818740 00:07:16.468 01:23:02 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 818740 00:07:16.468 01:23:02 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:17.034 lslocks: write error 00:07:17.034 01:23:02 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 818740 00:07:17.034 01:23:02 event.cpu_locks.default_locks -- common/autotest_common.sh@950 -- # '[' -z 818740 ']' 00:07:17.034 01:23:02 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # kill -0 818740 00:07:17.034 01:23:02 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # uname 00:07:17.034 01:23:02 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:17.034 01:23:02 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 818740 00:07:17.034 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:17.034 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:17.034 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 818740' 00:07:17.034 killing process with pid 818740 00:07:17.034 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@969 -- # kill 818740 00:07:17.034 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@974 -- # wait 818740 00:07:17.600 01:23:03 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 818740 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@650 -- # local es=0 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 818740 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # waitforlisten 818740 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 818740 ']' 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local max_retries=100 
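The default_locks case above starts a one-core target (-m 0x1) and asserts that the process holds a CPU-core lock: locks_exist lists the locks owned by the pid and greps for spdk_cpu_lock (the "lslocks: write error" below is most likely just lslocks hitting a broken pipe after grep -q exits on the first match). A hedged condensation of that check, with the pid taken from this run:

  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 &
  pid=$!                                              # 818740 in this run
  lslocks -p "$pid" | grep -q spdk_cpu_lock && echo "core lock held by $pid"
  kill "$pid" && wait "$pid"                          # after this, waitforlisten on the same pid must fail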
00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:17.601 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:17.601 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (818740) - No such process 00:07:17.601 ERROR: process (pid: 818740) is no longer running 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 1 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # es=1 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:17.601 00:07:17.601 real 0m1.406s 00:07:17.601 user 0m1.405s 00:07:17.601 sys 0m0.709s 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:17.601 01:23:03 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:17.601 ************************************ 00:07:17.601 END TEST default_locks 00:07:17.601 ************************************ 00:07:17.601 01:23:03 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:17.601 01:23:03 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:17.601 01:23:03 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:17.601 01:23:03 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:17.601 ************************************ 00:07:17.601 START TEST default_locks_via_rpc 00:07:17.601 ************************************ 00:07:17.601 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1125 -- # default_locks_via_rpc 00:07:17.601 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=819024 00:07:17.601 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 819024 00:07:17.601 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:17.601 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 819024 ']' 00:07:17.601 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:17.601 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:17.601 01:23:03 
event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:17.601 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:17.601 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:17.601 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:17.601 [2024-12-17 01:23:03.439574] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:17.601 [2024-12-17 01:23:03.439638] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid819024 ] 00:07:17.601 [2024-12-17 01:23:03.507776] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.601 [2024-12-17 01:23:03.549390] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 819024 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 819024 00:07:17.860 01:23:03 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:18.426 01:23:04 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 819024 00:07:18.426 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@950 -- # '[' -z 819024 ']' 00:07:18.426 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # kill -0 819024 00:07:18.426 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 -- # uname 00:07:18.426 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 
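default_locks_via_rpc exercises the same lock, but toggled at runtime: with the target already up, framework_disable_cpumask_locks releases the core lock (so no spdk_cpu_lock entry is expected) and framework_enable_cpumask_locks takes it back before the lslocks check above. A hedged sketch against the default /var/tmp/spdk.sock socket, with the pid hard-coded to the value from this run:

  rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
  pid=819024                                         # pid of the already-running spdk_tgt in this log
  $rpc framework_disable_cpumask_locks               # drop the core lock while the app keeps running
  lslocks -p "$pid" | grep -c spdk_cpu_lock          # expected: 0
  $rpc framework_enable_cpumask_locks                # take it back
  lslocks -p "$pid" | grep -q spdk_cpu_lock          # expected: match again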
00:07:18.426 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 819024 00:07:18.685 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:18.685 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:18.685 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 819024' 00:07:18.685 killing process with pid 819024 00:07:18.685 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@969 -- # kill 819024 00:07:18.685 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@974 -- # wait 819024 00:07:18.943 00:07:18.943 real 0m1.355s 00:07:18.943 user 0m1.335s 00:07:18.943 sys 0m0.634s 00:07:18.944 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:18.944 01:23:04 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:18.944 ************************************ 00:07:18.944 END TEST default_locks_via_rpc 00:07:18.944 ************************************ 00:07:18.944 01:23:04 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:18.944 01:23:04 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:18.944 01:23:04 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:18.944 01:23:04 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:18.944 ************************************ 00:07:18.944 START TEST non_locking_app_on_locked_coremask 00:07:18.944 ************************************ 00:07:18.944 01:23:04 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # non_locking_app_on_locked_coremask 00:07:18.944 01:23:04 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=819308 00:07:18.944 01:23:04 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 819308 /var/tmp/spdk.sock 00:07:18.944 01:23:04 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:18.944 01:23:04 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 819308 ']' 00:07:18.944 01:23:04 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:18.944 01:23:04 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:18.944 01:23:04 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:18.944 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:18.944 01:23:04 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:18.944 01:23:04 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:18.944 [2024-12-17 01:23:04.877967] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:18.944 [2024-12-17 01:23:04.878030] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid819308 ] 00:07:18.944 [2024-12-17 01:23:04.944782] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.203 [2024-12-17 01:23:04.984190] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.203 01:23:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:19.203 01:23:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:19.203 01:23:05 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=819424 00:07:19.203 01:23:05 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 819424 /var/tmp/spdk2.sock 00:07:19.203 01:23:05 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:19.203 01:23:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 819424 ']' 00:07:19.203 01:23:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:19.203 01:23:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:19.203 01:23:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:19.203 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:19.203 01:23:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:19.203 01:23:05 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:19.462 [2024-12-17 01:23:05.209471] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:19.462 [2024-12-17 01:23:05.209534] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid819424 ] 00:07:19.462 [2024-12-17 01:23:05.298488] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
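non_locking_app_on_locked_coremask, starting above, runs two targets on the same core: the first (pid 819308) takes the core 0 lock as usual, and the second (819424) is launched with --disable-cpumask-locks on its own RPC socket, so it starts cleanly and prints "CPU core locks deactivated." instead of failing. A hedged two-instance reproduction using the exact command lines from the log:

  spdk_tgt=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
  $spdk_tgt -m 0x1 &                                                  # holds the lock on core 0
  $spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &   # coexists on core 0 without a lock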
00:07:19.462 [2024-12-17 01:23:05.298516] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.462 [2024-12-17 01:23:05.378465] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.394 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:20.394 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:20.394 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 819308 00:07:20.394 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 819308 00:07:20.394 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:20.959 lslocks: write error 00:07:20.959 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 819308 00:07:20.959 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 819308 ']' 00:07:20.959 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 819308 00:07:20.959 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:20.959 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:20.959 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 819308 00:07:20.959 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:20.959 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:20.959 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 819308' 00:07:20.959 killing process with pid 819308 00:07:20.959 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 819308 00:07:20.959 01:23:06 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 819308 00:07:21.526 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 819424 00:07:21.526 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 819424 ']' 00:07:21.526 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 819424 00:07:21.526 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:21.784 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:21.784 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 819424 00:07:21.784 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:21.784 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:21.784 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 819424' 00:07:21.784 killing 
process with pid 819424 00:07:21.784 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 819424 00:07:21.784 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 819424 00:07:22.042 00:07:22.042 real 0m3.039s 00:07:22.042 user 0m3.162s 00:07:22.042 sys 0m1.167s 00:07:22.042 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:22.042 01:23:07 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:22.042 ************************************ 00:07:22.042 END TEST non_locking_app_on_locked_coremask 00:07:22.042 ************************************ 00:07:22.042 01:23:07 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:07:22.042 01:23:07 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:22.042 01:23:07 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:22.043 01:23:07 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:22.043 ************************************ 00:07:22.043 START TEST locking_app_on_unlocked_coremask 00:07:22.043 ************************************ 00:07:22.043 01:23:07 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_unlocked_coremask 00:07:22.043 01:23:07 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=819885 00:07:22.043 01:23:07 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 819885 /var/tmp/spdk.sock 00:07:22.043 01:23:07 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:22.043 01:23:07 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 819885 ']' 00:07:22.043 01:23:07 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:22.043 01:23:07 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:22.043 01:23:07 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:22.043 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:22.043 01:23:07 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:22.043 01:23:07 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:22.043 [2024-12-17 01:23:07.998571] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:22.043 [2024-12-17 01:23:07.998629] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid819885 ] 00:07:22.301 [2024-12-17 01:23:08.065255] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:22.301 [2024-12-17 01:23:08.065283] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.301 [2024-12-17 01:23:08.101000] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.301 01:23:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:22.301 01:23:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:22.301 01:23:08 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=820006 00:07:22.301 01:23:08 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 820006 /var/tmp/spdk2.sock 00:07:22.302 01:23:08 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:22.302 01:23:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 820006 ']' 00:07:22.302 01:23:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:22.302 01:23:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:22.302 01:23:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:22.302 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:22.302 01:23:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:22.302 01:23:08 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:22.560 [2024-12-17 01:23:08.314675] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:22.560 [2024-12-17 01:23:08.314746] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid820006 ] 00:07:22.560 [2024-12-17 01:23:08.404463] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.560 [2024-12-17 01:23:08.482624] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.494 01:23:09 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:23.494 01:23:09 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:23.494 01:23:09 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 820006 00:07:23.494 01:23:09 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 820006 00:07:23.494 01:23:09 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:24.429 lslocks: write error 00:07:24.429 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 819885 00:07:24.429 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 819885 ']' 00:07:24.429 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 819885 00:07:24.429 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:24.429 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:24.429 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 819885 00:07:24.429 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:24.429 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:24.429 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 819885' 00:07:24.429 killing process with pid 819885 00:07:24.429 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 819885 00:07:24.429 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 819885 00:07:25.116 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 820006 00:07:25.116 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 820006 ']' 00:07:25.116 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 820006 00:07:25.116 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:25.116 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:25.116 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 820006 00:07:25.116 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:25.116 01:23:10 
event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:25.116 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 820006' 00:07:25.116 killing process with pid 820006 00:07:25.116 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 820006 00:07:25.116 01:23:10 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 820006 00:07:25.393 00:07:25.393 real 0m3.174s 00:07:25.393 user 0m3.353s 00:07:25.393 sys 0m1.185s 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:25.393 ************************************ 00:07:25.393 END TEST locking_app_on_unlocked_coremask 00:07:25.393 ************************************ 00:07:25.393 01:23:11 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:25.393 01:23:11 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:25.393 01:23:11 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:25.393 01:23:11 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:25.393 ************************************ 00:07:25.393 START TEST locking_app_on_locked_coremask 00:07:25.393 ************************************ 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_locked_coremask 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=820502 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 820502 /var/tmp/spdk.sock 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 820502 ']' 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:25.393 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:25.393 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:25.393 [2024-12-17 01:23:11.245998] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:25.393 [2024-12-17 01:23:11.246057] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid820502 ] 00:07:25.393 [2024-12-17 01:23:11.308144] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.393 [2024-12-17 01:23:11.348744] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=820665 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 820665 /var/tmp/spdk2.sock 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@650 -- # local es=0 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 820665 /var/tmp/spdk2.sock 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # waitforlisten 820665 /var/tmp/spdk2.sock 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 820665 ']' 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:25.652 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:25.652 01:23:11 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:25.652 [2024-12-17 01:23:11.570279] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:25.652 [2024-12-17 01:23:11.570369] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid820665 ] 00:07:25.910 [2024-12-17 01:23:11.658550] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 820502 has claimed it. 00:07:25.910 [2024-12-17 01:23:11.658588] app.c: 910:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:26.477 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (820665) - No such process 00:07:26.477 ERROR: process (pid: 820665) is no longer running 00:07:26.477 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:26.477 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 1 00:07:26.477 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # es=1 00:07:26.477 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:26.477 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:26.477 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:26.477 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 820502 00:07:26.477 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 820502 00:07:26.477 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:26.736 lslocks: write error 00:07:26.736 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 820502 00:07:26.736 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 820502 ']' 00:07:26.736 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 820502 00:07:26.736 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:26.736 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:26.736 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 820502 00:07:26.736 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:26.736 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:26.736 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 820502' 00:07:26.736 killing process with pid 820502 00:07:26.736 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 820502 00:07:26.736 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 820502 00:07:26.994 00:07:26.994 real 0m1.626s 00:07:26.994 user 0m1.700s 00:07:26.994 sys 0m0.624s 00:07:26.994 01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:26.994 
01:23:12 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:26.994 ************************************ 00:07:26.994 END TEST locking_app_on_locked_coremask 00:07:26.994 ************************************ 00:07:26.995 01:23:12 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:26.995 01:23:12 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:26.995 01:23:12 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:26.995 01:23:12 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:26.995 ************************************ 00:07:26.995 START TEST locking_overlapped_coremask 00:07:26.995 ************************************ 00:07:26.995 01:23:12 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask 00:07:26.995 01:23:12 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=820905 00:07:26.995 01:23:12 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:26.995 01:23:12 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 820905 /var/tmp/spdk.sock 00:07:26.995 01:23:12 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 820905 ']' 00:07:26.995 01:23:12 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:26.995 01:23:12 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:26.995 01:23:12 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:26.995 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:26.995 01:23:12 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:26.995 01:23:12 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:26.995 [2024-12-17 01:23:12.957720] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:26.995 [2024-12-17 01:23:12.957776] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid820905 ] 00:07:27.253 [2024-12-17 01:23:13.023872] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:27.253 [2024-12-17 01:23:13.065341] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:27.253 [2024-12-17 01:23:13.065436] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:27.253 [2024-12-17 01:23:13.065437] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=821028 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 821028 /var/tmp/spdk2.sock 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@650 -- # local es=0 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 821028 /var/tmp/spdk2.sock 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # waitforlisten 821028 /var/tmp/spdk2.sock 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 821028 ']' 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:27.512 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:27.512 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:27.512 [2024-12-17 01:23:13.290171] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:27.512 [2024-12-17 01:23:13.290235] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid821028 ] 00:07:27.512 [2024-12-17 01:23:13.380274] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 820905 has claimed it. 00:07:27.512 [2024-12-17 01:23:13.380313] app.c: 910:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:28.079 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (821028) - No such process 00:07:28.079 ERROR: process (pid: 821028) is no longer running 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 1 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # es=1 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 820905 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@950 -- # '[' -z 820905 ']' 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # kill -0 820905 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # uname 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:28.079 01:23:13 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 820905 00:07:28.079 01:23:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:28.079 01:23:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:28.079 01:23:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 820905' 00:07:28.079 killing process with pid 820905 00:07:28.079 01:23:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@969 -- # kill 820905 00:07:28.079 01:23:14 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@974 -- # wait 820905 00:07:28.338 00:07:28.338 real 0m1.390s 00:07:28.338 user 0m3.826s 00:07:28.338 sys 0m0.421s 00:07:28.338 01:23:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:28.338 01:23:14 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:28.338 ************************************ 00:07:28.338 END TEST locking_overlapped_coremask 00:07:28.338 ************************************ 00:07:28.596 01:23:14 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:28.596 01:23:14 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:28.596 01:23:14 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:28.596 01:23:14 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:28.596 ************************************ 00:07:28.596 START TEST locking_overlapped_coremask_via_rpc 00:07:28.596 ************************************ 00:07:28.596 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask_via_rpc 00:07:28.596 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=821166 00:07:28.596 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 821166 /var/tmp/spdk.sock 00:07:28.596 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:28.596 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 821166 ']' 00:07:28.596 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:28.596 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:28.596 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:28.596 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:28.596 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:28.596 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:28.596 [2024-12-17 01:23:14.430250] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:28.596 [2024-12-17 01:23:14.430309] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid821166 ] 00:07:28.596 [2024-12-17 01:23:14.498245] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:28.596 [2024-12-17 01:23:14.498278] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:28.596 [2024-12-17 01:23:14.536952] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:28.596 [2024-12-17 01:23:14.537046] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:28.597 [2024-12-17 01:23:14.537049] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.855 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:28.855 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:28.855 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=821329 00:07:28.855 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 821329 /var/tmp/spdk2.sock 00:07:28.855 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:28.855 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 821329 ']' 00:07:28.855 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:28.855 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:28.855 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:28.855 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:28.855 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:28.855 01:23:14 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:28.855 [2024-12-17 01:23:14.754866] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:28.855 [2024-12-17 01:23:14.754931] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid821329 ] 00:07:28.855 [2024-12-17 01:23:14.848335] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:28.855 [2024-12-17 01:23:14.848369] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:29.114 [2024-12-17 01:23:14.928264] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:07:29.114 [2024-12-17 01:23:14.928309] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:29.114 [2024-12-17 01:23:14.928310] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@650 -- # local es=0 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:29.680 [2024-12-17 01:23:15.635867] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 821166 has claimed it. 
00:07:29.680 request: 00:07:29.680 { 00:07:29.680 "method": "framework_enable_cpumask_locks", 00:07:29.680 "req_id": 1 00:07:29.680 } 00:07:29.680 Got JSON-RPC error response 00:07:29.680 response: 00:07:29.680 { 00:07:29.680 "code": -32603, 00:07:29.680 "message": "Failed to claim CPU core: 2" 00:07:29.680 } 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # es=1 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 821166 /var/tmp/spdk.sock 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 821166 ']' 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:29.680 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:29.680 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:29.938 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:29.938 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:29.938 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 821329 /var/tmp/spdk2.sock 00:07:29.938 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 821329 ']' 00:07:29.938 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:29.938 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:29.939 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:29.939 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:07:29.939 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:29.939 01:23:15 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:30.197 01:23:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:30.197 01:23:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:30.197 01:23:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:30.197 01:23:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:30.197 01:23:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:30.197 01:23:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:30.197 00:07:30.197 real 0m1.639s 00:07:30.197 user 0m0.780s 00:07:30.197 sys 0m0.163s 00:07:30.197 01:23:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:30.197 01:23:16 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:30.197 ************************************ 00:07:30.197 END TEST locking_overlapped_coremask_via_rpc 00:07:30.197 ************************************ 00:07:30.197 01:23:16 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:30.197 01:23:16 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 821166 ]] 00:07:30.197 01:23:16 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 821166 00:07:30.197 01:23:16 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 821166 ']' 00:07:30.197 01:23:16 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 821166 00:07:30.197 01:23:16 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:07:30.197 01:23:16 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:30.197 01:23:16 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 821166 00:07:30.197 01:23:16 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:30.197 01:23:16 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:30.197 01:23:16 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 821166' 00:07:30.197 killing process with pid 821166 00:07:30.197 01:23:16 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 821166 00:07:30.197 01:23:16 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 821166 00:07:30.764 01:23:16 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 821329 ]] 00:07:30.764 01:23:16 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 821329 00:07:30.764 01:23:16 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 821329 ']' 00:07:30.764 01:23:16 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 821329 00:07:30.764 01:23:16 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:07:30.764 01:23:16 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 
00:07:30.764 01:23:16 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 821329 00:07:30.764 01:23:16 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:07:30.764 01:23:16 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:07:30.764 01:23:16 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 821329' 00:07:30.764 killing process with pid 821329 00:07:30.764 01:23:16 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 821329 00:07:30.764 01:23:16 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 821329 00:07:31.023 01:23:16 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:31.023 01:23:16 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:31.023 01:23:16 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 821166 ]] 00:07:31.023 01:23:16 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 821166 00:07:31.023 01:23:16 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 821166 ']' 00:07:31.023 01:23:16 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 821166 00:07:31.023 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (821166) - No such process 00:07:31.023 01:23:16 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 821166 is not found' 00:07:31.023 Process with pid 821166 is not found 00:07:31.023 01:23:16 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 821329 ]] 00:07:31.023 01:23:16 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 821329 00:07:31.023 01:23:16 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 821329 ']' 00:07:31.023 01:23:16 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 821329 00:07:31.023 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (821329) - No such process 00:07:31.023 01:23:16 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 821329 is not found' 00:07:31.023 Process with pid 821329 is not found 00:07:31.023 01:23:16 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:31.023 00:07:31.023 real 0m15.168s 00:07:31.023 user 0m25.389s 00:07:31.023 sys 0m5.981s 00:07:31.023 01:23:16 event.cpu_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:31.023 01:23:16 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:31.023 ************************************ 00:07:31.023 END TEST cpu_locks 00:07:31.023 ************************************ 00:07:31.023 00:07:31.023 real 0m39.945s 00:07:31.023 user 1m14.268s 00:07:31.023 sys 0m10.305s 00:07:31.023 01:23:16 event -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:31.023 01:23:16 event -- common/autotest_common.sh@10 -- # set +x 00:07:31.023 ************************************ 00:07:31.023 END TEST event 00:07:31.023 ************************************ 00:07:31.023 01:23:16 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:31.023 01:23:16 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:31.023 01:23:16 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:31.023 01:23:16 -- common/autotest_common.sh@10 -- # set +x 00:07:31.023 ************************************ 00:07:31.023 START TEST thread 00:07:31.023 ************************************ 00:07:31.023 01:23:16 thread -- common/autotest_common.sh@1125 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:31.282 * Looking for test storage... 00:07:31.282 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@1681 -- # lcov --version 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:31.282 01:23:17 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:31.282 01:23:17 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:31.282 01:23:17 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:31.282 01:23:17 thread -- scripts/common.sh@336 -- # IFS=.-: 00:07:31.282 01:23:17 thread -- scripts/common.sh@336 -- # read -ra ver1 00:07:31.282 01:23:17 thread -- scripts/common.sh@337 -- # IFS=.-: 00:07:31.282 01:23:17 thread -- scripts/common.sh@337 -- # read -ra ver2 00:07:31.282 01:23:17 thread -- scripts/common.sh@338 -- # local 'op=<' 00:07:31.282 01:23:17 thread -- scripts/common.sh@340 -- # ver1_l=2 00:07:31.282 01:23:17 thread -- scripts/common.sh@341 -- # ver2_l=1 00:07:31.282 01:23:17 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:31.282 01:23:17 thread -- scripts/common.sh@344 -- # case "$op" in 00:07:31.282 01:23:17 thread -- scripts/common.sh@345 -- # : 1 00:07:31.282 01:23:17 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:31.282 01:23:17 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:31.282 01:23:17 thread -- scripts/common.sh@365 -- # decimal 1 00:07:31.282 01:23:17 thread -- scripts/common.sh@353 -- # local d=1 00:07:31.282 01:23:17 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:31.282 01:23:17 thread -- scripts/common.sh@355 -- # echo 1 00:07:31.282 01:23:17 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:07:31.282 01:23:17 thread -- scripts/common.sh@366 -- # decimal 2 00:07:31.282 01:23:17 thread -- scripts/common.sh@353 -- # local d=2 00:07:31.282 01:23:17 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:31.282 01:23:17 thread -- scripts/common.sh@355 -- # echo 2 00:07:31.282 01:23:17 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:07:31.282 01:23:17 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:31.282 01:23:17 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:31.282 01:23:17 thread -- scripts/common.sh@368 -- # return 0 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:31.282 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.282 --rc genhtml_branch_coverage=1 00:07:31.282 --rc genhtml_function_coverage=1 00:07:31.282 --rc genhtml_legend=1 00:07:31.282 --rc geninfo_all_blocks=1 00:07:31.282 --rc geninfo_unexecuted_blocks=1 00:07:31.282 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.282 ' 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:31.282 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.282 --rc genhtml_branch_coverage=1 00:07:31.282 --rc genhtml_function_coverage=1 00:07:31.282 --rc genhtml_legend=1 00:07:31.282 --rc geninfo_all_blocks=1 
00:07:31.282 --rc geninfo_unexecuted_blocks=1 00:07:31.282 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.282 ' 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:31.282 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.282 --rc genhtml_branch_coverage=1 00:07:31.282 --rc genhtml_function_coverage=1 00:07:31.282 --rc genhtml_legend=1 00:07:31.282 --rc geninfo_all_blocks=1 00:07:31.282 --rc geninfo_unexecuted_blocks=1 00:07:31.282 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.282 ' 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:31.282 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.282 --rc genhtml_branch_coverage=1 00:07:31.282 --rc genhtml_function_coverage=1 00:07:31.282 --rc genhtml_legend=1 00:07:31.282 --rc geninfo_all_blocks=1 00:07:31.282 --rc geninfo_unexecuted_blocks=1 00:07:31.282 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.282 ' 00:07:31.282 01:23:17 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:31.282 01:23:17 thread -- common/autotest_common.sh@10 -- # set +x 00:07:31.282 ************************************ 00:07:31.282 START TEST thread_poller_perf 00:07:31.282 ************************************ 00:07:31.282 01:23:17 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:31.282 [2024-12-17 01:23:17.224946] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:31.282 [2024-12-17 01:23:17.225032] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid821720 ] 00:07:31.541 [2024-12-17 01:23:17.296691] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.541 [2024-12-17 01:23:17.334994] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.541 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:32.476 [2024-12-17T00:23:18.479Z] ====================================== 00:07:32.476 [2024-12-17T00:23:18.479Z] busy:2503262172 (cyc) 00:07:32.476 [2024-12-17T00:23:18.479Z] total_run_count: 852000 00:07:32.476 [2024-12-17T00:23:18.479Z] tsc_hz: 2500000000 (cyc) 00:07:32.476 [2024-12-17T00:23:18.479Z] ====================================== 00:07:32.476 [2024-12-17T00:23:18.479Z] poller_cost: 2938 (cyc), 1175 (nsec) 00:07:32.476 00:07:32.476 real 0m1.184s 00:07:32.476 user 0m1.089s 00:07:32.476 sys 0m0.090s 00:07:32.476 01:23:18 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:32.476 01:23:18 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:32.476 ************************************ 00:07:32.476 END TEST thread_poller_perf 00:07:32.476 ************************************ 00:07:32.476 01:23:18 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:32.476 01:23:18 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:32.476 01:23:18 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:32.476 01:23:18 thread -- common/autotest_common.sh@10 -- # set +x 00:07:32.476 ************************************ 00:07:32.476 START TEST thread_poller_perf 00:07:32.476 ************************************ 00:07:32.476 01:23:18 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:32.735 [2024-12-17 01:23:18.495324] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:32.735 [2024-12-17 01:23:18.495453] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid822000 ] 00:07:32.735 [2024-12-17 01:23:18.566565] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.735 [2024-12-17 01:23:18.606030] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.735 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:07:33.670 [2024-12-17T00:23:19.673Z] ====================================== 00:07:33.670 [2024-12-17T00:23:19.673Z] busy:2501399310 (cyc) 00:07:33.670 [2024-12-17T00:23:19.673Z] total_run_count: 13325000 00:07:33.670 [2024-12-17T00:23:19.673Z] tsc_hz: 2500000000 (cyc) 00:07:33.670 [2024-12-17T00:23:19.673Z] ====================================== 00:07:33.670 [2024-12-17T00:23:19.673Z] poller_cost: 187 (cyc), 74 (nsec) 00:07:33.670 00:07:33.670 real 0m1.187s 00:07:33.670 user 0m1.098s 00:07:33.670 sys 0m0.084s 00:07:33.670 01:23:19 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:33.670 01:23:19 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:33.670 ************************************ 00:07:33.670 END TEST thread_poller_perf 00:07:33.670 ************************************ 00:07:33.929 01:23:19 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:33.929 01:23:19 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:33.929 01:23:19 thread -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:33.929 01:23:19 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:33.929 01:23:19 thread -- common/autotest_common.sh@10 -- # set +x 00:07:33.929 ************************************ 00:07:33.929 START TEST thread_spdk_lock 00:07:33.929 ************************************ 00:07:33.929 01:23:19 thread.thread_spdk_lock -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:33.929 [2024-12-17 01:23:19.763462] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:33.929 [2024-12-17 01:23:19.763550] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid822280 ] 00:07:33.929 [2024-12-17 01:23:19.832670] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:33.929 [2024-12-17 01:23:19.873574] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:33.929 [2024-12-17 01:23:19.873575] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.495 [2024-12-17 01:23:20.355871] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 967:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:34.495 [2024-12-17 01:23:20.355913] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3080:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:34.495 [2024-12-17 01:23:20.355924] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3035:sspin_stacks_print: *ERROR*: spinlock 0x130cd00 00:07:34.496 [2024-12-17 01:23:20.356846] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 862:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:34.496 [2024-12-17 01:23:20.356950] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1028:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:34.496 [2024-12-17 
01:23:20.356969] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 862:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:34.496 Starting test contend 00:07:34.496 Worker Delay Wait us Hold us Total us 00:07:34.496 0 3 158448 180735 339184 00:07:34.496 1 5 74096 281643 355740 00:07:34.496 PASS test contend 00:07:34.496 Starting test hold_by_poller 00:07:34.496 PASS test hold_by_poller 00:07:34.496 Starting test hold_by_message 00:07:34.496 PASS test hold_by_message 00:07:34.496 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:34.496 100014 assertions passed 00:07:34.496 0 assertions failed 00:07:34.496 00:07:34.496 real 0m0.667s 00:07:34.496 user 0m1.059s 00:07:34.496 sys 0m0.088s 00:07:34.496 01:23:20 thread.thread_spdk_lock -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:34.496 01:23:20 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:07:34.496 ************************************ 00:07:34.496 END TEST thread_spdk_lock 00:07:34.496 ************************************ 00:07:34.496 00:07:34.496 real 0m3.476s 00:07:34.496 user 0m3.429s 00:07:34.496 sys 0m0.549s 00:07:34.496 01:23:20 thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:34.496 01:23:20 thread -- common/autotest_common.sh@10 -- # set +x 00:07:34.496 ************************************ 00:07:34.496 END TEST thread 00:07:34.496 ************************************ 00:07:34.496 01:23:20 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:07:34.496 01:23:20 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:34.496 01:23:20 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:34.496 01:23:20 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:34.496 01:23:20 -- common/autotest_common.sh@10 -- # set +x 00:07:34.754 ************************************ 00:07:34.754 START TEST app_cmdline 00:07:34.754 ************************************ 00:07:34.754 01:23:20 app_cmdline -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:34.754 * Looking for test storage... 
00:07:34.754 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:34.754 01:23:20 app_cmdline -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:34.754 01:23:20 app_cmdline -- common/autotest_common.sh@1681 -- # lcov --version 00:07:34.754 01:23:20 app_cmdline -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:34.754 01:23:20 app_cmdline -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@345 -- # : 1 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:34.754 01:23:20 app_cmdline -- scripts/common.sh@368 -- # return 0 00:07:34.754 01:23:20 app_cmdline -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:34.754 01:23:20 app_cmdline -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:34.754 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.754 --rc genhtml_branch_coverage=1 00:07:34.754 --rc genhtml_function_coverage=1 00:07:34.755 --rc genhtml_legend=1 00:07:34.755 --rc geninfo_all_blocks=1 00:07:34.755 --rc geninfo_unexecuted_blocks=1 00:07:34.755 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.755 ' 00:07:34.755 01:23:20 app_cmdline -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:34.755 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.755 --rc genhtml_branch_coverage=1 00:07:34.755 --rc genhtml_function_coverage=1 00:07:34.755 --rc 
genhtml_legend=1 00:07:34.755 --rc geninfo_all_blocks=1 00:07:34.755 --rc geninfo_unexecuted_blocks=1 00:07:34.755 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.755 ' 00:07:34.755 01:23:20 app_cmdline -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:34.755 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.755 --rc genhtml_branch_coverage=1 00:07:34.755 --rc genhtml_function_coverage=1 00:07:34.755 --rc genhtml_legend=1 00:07:34.755 --rc geninfo_all_blocks=1 00:07:34.755 --rc geninfo_unexecuted_blocks=1 00:07:34.755 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.755 ' 00:07:34.755 01:23:20 app_cmdline -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:34.755 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:34.755 --rc genhtml_branch_coverage=1 00:07:34.755 --rc genhtml_function_coverage=1 00:07:34.755 --rc genhtml_legend=1 00:07:34.755 --rc geninfo_all_blocks=1 00:07:34.755 --rc geninfo_unexecuted_blocks=1 00:07:34.755 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:34.755 ' 00:07:34.755 01:23:20 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:34.755 01:23:20 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=822540 00:07:34.755 01:23:20 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:34.755 01:23:20 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 822540 00:07:34.755 01:23:20 app_cmdline -- common/autotest_common.sh@831 -- # '[' -z 822540 ']' 00:07:34.755 01:23:20 app_cmdline -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:34.755 01:23:20 app_cmdline -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:34.755 01:23:20 app_cmdline -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:34.755 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:34.755 01:23:20 app_cmdline -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:34.755 01:23:20 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:35.014 [2024-12-17 01:23:20.762087] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:35.014 [2024-12-17 01:23:20.762151] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid822540 ] 00:07:35.014 [2024-12-17 01:23:20.828451] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.014 [2024-12-17 01:23:20.866547] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.272 01:23:21 app_cmdline -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:35.272 01:23:21 app_cmdline -- common/autotest_common.sh@864 -- # return 0 00:07:35.272 01:23:21 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:35.272 { 00:07:35.272 "version": "SPDK v24.09.1-pre git sha1 b18e1bd62", 00:07:35.272 "fields": { 00:07:35.272 "major": 24, 00:07:35.272 "minor": 9, 00:07:35.272 "patch": 1, 00:07:35.272 "suffix": "-pre", 00:07:35.272 "commit": "b18e1bd62" 00:07:35.272 } 00:07:35.272 } 00:07:35.272 01:23:21 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:35.272 01:23:21 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:35.272 01:23:21 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:35.272 01:23:21 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:35.272 01:23:21 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:35.272 01:23:21 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:35.272 01:23:21 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:35.272 01:23:21 app_cmdline -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:35.272 01:23:21 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:35.272 01:23:21 app_cmdline -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:35.531 01:23:21 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:35.531 01:23:21 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:35.531 01:23:21 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@650 -- # local es=0 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:35.531 01:23:21 app_cmdline -- 
common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:35.531 request: 00:07:35.531 { 00:07:35.531 "method": "env_dpdk_get_mem_stats", 00:07:35.531 "req_id": 1 00:07:35.531 } 00:07:35.531 Got JSON-RPC error response 00:07:35.531 response: 00:07:35.531 { 00:07:35.531 "code": -32601, 00:07:35.531 "message": "Method not found" 00:07:35.531 } 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@653 -- # es=1 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:35.531 01:23:21 app_cmdline -- app/cmdline.sh@1 -- # killprocess 822540 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@950 -- # '[' -z 822540 ']' 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@954 -- # kill -0 822540 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@955 -- # uname 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:35.531 01:23:21 app_cmdline -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 822540 00:07:35.789 01:23:21 app_cmdline -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:35.789 01:23:21 app_cmdline -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:35.789 01:23:21 app_cmdline -- common/autotest_common.sh@968 -- # echo 'killing process with pid 822540' 00:07:35.789 killing process with pid 822540 00:07:35.789 01:23:21 app_cmdline -- common/autotest_common.sh@969 -- # kill 822540 00:07:35.789 01:23:21 app_cmdline -- common/autotest_common.sh@974 -- # wait 822540 00:07:36.047 00:07:36.047 real 0m1.306s 00:07:36.047 user 0m1.479s 00:07:36.047 sys 0m0.502s 00:07:36.047 01:23:21 app_cmdline -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:36.047 01:23:21 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:36.047 ************************************ 00:07:36.047 END TEST app_cmdline 00:07:36.047 ************************************ 00:07:36.047 01:23:21 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:36.047 01:23:21 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:36.047 01:23:21 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:36.047 01:23:21 -- common/autotest_common.sh@10 -- # set +x 00:07:36.047 ************************************ 00:07:36.047 START TEST version 00:07:36.047 ************************************ 00:07:36.047 01:23:21 version -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:36.047 * Looking for test storage... 
00:07:36.047 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:36.047 01:23:22 version -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:36.047 01:23:22 version -- common/autotest_common.sh@1681 -- # lcov --version 00:07:36.047 01:23:22 version -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:36.305 01:23:22 version -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:36.305 01:23:22 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:36.305 01:23:22 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:36.305 01:23:22 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:36.305 01:23:22 version -- scripts/common.sh@336 -- # IFS=.-: 00:07:36.305 01:23:22 version -- scripts/common.sh@336 -- # read -ra ver1 00:07:36.305 01:23:22 version -- scripts/common.sh@337 -- # IFS=.-: 00:07:36.305 01:23:22 version -- scripts/common.sh@337 -- # read -ra ver2 00:07:36.305 01:23:22 version -- scripts/common.sh@338 -- # local 'op=<' 00:07:36.305 01:23:22 version -- scripts/common.sh@340 -- # ver1_l=2 00:07:36.305 01:23:22 version -- scripts/common.sh@341 -- # ver2_l=1 00:07:36.305 01:23:22 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:36.305 01:23:22 version -- scripts/common.sh@344 -- # case "$op" in 00:07:36.305 01:23:22 version -- scripts/common.sh@345 -- # : 1 00:07:36.305 01:23:22 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:36.305 01:23:22 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:36.305 01:23:22 version -- scripts/common.sh@365 -- # decimal 1 00:07:36.305 01:23:22 version -- scripts/common.sh@353 -- # local d=1 00:07:36.305 01:23:22 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:36.305 01:23:22 version -- scripts/common.sh@355 -- # echo 1 00:07:36.305 01:23:22 version -- scripts/common.sh@365 -- # ver1[v]=1 00:07:36.305 01:23:22 version -- scripts/common.sh@366 -- # decimal 2 00:07:36.305 01:23:22 version -- scripts/common.sh@353 -- # local d=2 00:07:36.306 01:23:22 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:36.306 01:23:22 version -- scripts/common.sh@355 -- # echo 2 00:07:36.306 01:23:22 version -- scripts/common.sh@366 -- # ver2[v]=2 00:07:36.306 01:23:22 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:36.306 01:23:22 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:36.306 01:23:22 version -- scripts/common.sh@368 -- # return 0 00:07:36.306 01:23:22 version -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:36.306 01:23:22 version -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:36.306 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.306 --rc genhtml_branch_coverage=1 00:07:36.306 --rc genhtml_function_coverage=1 00:07:36.306 --rc genhtml_legend=1 00:07:36.306 --rc geninfo_all_blocks=1 00:07:36.306 --rc geninfo_unexecuted_blocks=1 00:07:36.306 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.306 ' 00:07:36.306 01:23:22 version -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:36.306 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.306 --rc genhtml_branch_coverage=1 00:07:36.306 --rc genhtml_function_coverage=1 00:07:36.306 --rc genhtml_legend=1 00:07:36.306 --rc geninfo_all_blocks=1 00:07:36.306 --rc geninfo_unexecuted_blocks=1 00:07:36.306 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.306 ' 00:07:36.306 01:23:22 version -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:36.306 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.306 --rc genhtml_branch_coverage=1 00:07:36.306 --rc genhtml_function_coverage=1 00:07:36.306 --rc genhtml_legend=1 00:07:36.306 --rc geninfo_all_blocks=1 00:07:36.306 --rc geninfo_unexecuted_blocks=1 00:07:36.306 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.306 ' 00:07:36.306 01:23:22 version -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:36.306 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.306 --rc genhtml_branch_coverage=1 00:07:36.306 --rc genhtml_function_coverage=1 00:07:36.306 --rc genhtml_legend=1 00:07:36.306 --rc geninfo_all_blocks=1 00:07:36.306 --rc geninfo_unexecuted_blocks=1 00:07:36.306 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.306 ' 00:07:36.306 01:23:22 version -- app/version.sh@17 -- # get_header_version major 00:07:36.306 01:23:22 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:36.306 01:23:22 version -- app/version.sh@14 -- # cut -f2 00:07:36.306 01:23:22 version -- app/version.sh@14 -- # tr -d '"' 00:07:36.306 01:23:22 version -- app/version.sh@17 -- # major=24 00:07:36.306 01:23:22 version -- app/version.sh@18 -- # get_header_version minor 00:07:36.306 01:23:22 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:36.306 01:23:22 version -- app/version.sh@14 -- # cut -f2 00:07:36.306 01:23:22 version -- app/version.sh@14 -- # tr -d '"' 00:07:36.306 01:23:22 version -- app/version.sh@18 -- # minor=9 00:07:36.306 01:23:22 version -- app/version.sh@19 -- # get_header_version patch 00:07:36.306 01:23:22 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:36.306 01:23:22 version -- app/version.sh@14 -- # cut -f2 00:07:36.306 01:23:22 version -- app/version.sh@14 -- # tr -d '"' 00:07:36.306 01:23:22 version -- app/version.sh@19 -- # patch=1 00:07:36.306 01:23:22 version -- app/version.sh@20 -- # get_header_version suffix 00:07:36.306 01:23:22 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:36.306 01:23:22 version -- app/version.sh@14 -- # cut -f2 00:07:36.306 01:23:22 version -- app/version.sh@14 -- # tr -d '"' 00:07:36.306 01:23:22 version -- app/version.sh@20 -- # suffix=-pre 00:07:36.306 01:23:22 version -- app/version.sh@22 -- # version=24.9 00:07:36.306 01:23:22 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:36.306 01:23:22 version -- app/version.sh@25 -- # version=24.9.1 00:07:36.306 01:23:22 version -- app/version.sh@28 -- # version=24.9.1rc0 00:07:36.306 01:23:22 version -- app/version.sh@30 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:36.306 01:23:22 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:36.306 01:23:22 version -- app/version.sh@30 -- # py_version=24.9.1rc0 00:07:36.306 01:23:22 version -- app/version.sh@31 -- # [[ 24.9.1rc0 == \2\4\.\9\.\1\r\c\0 ]] 00:07:36.306 00:07:36.306 real 0m0.271s 00:07:36.306 user 0m0.148s 00:07:36.306 sys 0m0.178s 00:07:36.306 01:23:22 version -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:36.306 01:23:22 version -- common/autotest_common.sh@10 -- # set +x 00:07:36.306 ************************************ 00:07:36.306 END TEST version 00:07:36.306 ************************************ 00:07:36.306 01:23:22 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:07:36.306 01:23:22 -- spdk/autotest.sh@194 -- # uname -s 00:07:36.306 01:23:22 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:36.306 01:23:22 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:36.306 01:23:22 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:36.306 01:23:22 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@252 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@256 -- # timing_exit lib 00:07:36.306 01:23:22 -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:36.306 01:23:22 -- common/autotest_common.sh@10 -- # set +x 00:07:36.306 01:23:22 -- spdk/autotest.sh@258 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@263 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@272 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@307 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@334 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@351 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:36.306 01:23:22 -- spdk/autotest.sh@362 -- # [[ 0 -eq 1 ]] 00:07:36.306 01:23:22 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:36.306 01:23:22 -- spdk/autotest.sh@370 -- # [[ 1 -eq 1 ]] 00:07:36.306 01:23:22 -- spdk/autotest.sh@371 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:36.306 01:23:22 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:36.306 01:23:22 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:36.306 01:23:22 -- common/autotest_common.sh@10 -- # set +x 00:07:36.565 ************************************ 00:07:36.565 START TEST llvm_fuzz 00:07:36.565 ************************************ 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:36.565 * Looking for test storage... 
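The version.sh steps traced above condense into a small sketch: each component is pulled out of include/spdk/version.h by grepping its #define, taking the second tab-delimited field and stripping quotes, and the pieces are then assembled into the 24.9.1rc0 string that is compared against Python's spdk.__version__. Reconstructed from the commands visible in the trace, with paths shortened; not a verbatim copy of app/version.sh:

    get_header_version() {
        # pull the value of SPDK_VERSION_<NAME> out of the header, quotes stripped
        grep -E "^#define SPDK_VERSION_${1^^}[[:space:]]+" include/spdk/version.h |
            cut -f2 | tr -d '"'
    }
    major=$(get_header_version major)     # 24
    minor=$(get_header_version minor)     # 9
    patch=$(get_header_version patch)     # 1
    suffix=$(get_header_version suffix)   # -pre
    version=$major.$minor
    (( patch != 0 )) && version=$version.$patch
    [[ $suffix == -pre ]] && version=${version}rc0   # -> 24.9.1rc0
    py_version=$(python3 -c 'import spdk; print(spdk.__version__)')
    [[ $py_version == "$version" ]]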
00:07:36.565 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:36.565 01:23:22 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:36.565 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.565 --rc genhtml_branch_coverage=1 00:07:36.565 --rc genhtml_function_coverage=1 00:07:36.565 --rc genhtml_legend=1 00:07:36.565 --rc geninfo_all_blocks=1 00:07:36.565 --rc geninfo_unexecuted_blocks=1 00:07:36.565 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.565 ' 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:36.565 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.565 --rc genhtml_branch_coverage=1 00:07:36.565 --rc genhtml_function_coverage=1 00:07:36.565 --rc genhtml_legend=1 00:07:36.565 --rc geninfo_all_blocks=1 00:07:36.565 --rc 
geninfo_unexecuted_blocks=1 00:07:36.565 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.565 ' 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:36.565 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.565 --rc genhtml_branch_coverage=1 00:07:36.565 --rc genhtml_function_coverage=1 00:07:36.565 --rc genhtml_legend=1 00:07:36.565 --rc geninfo_all_blocks=1 00:07:36.565 --rc geninfo_unexecuted_blocks=1 00:07:36.565 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.565 ' 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:36.565 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.565 --rc genhtml_branch_coverage=1 00:07:36.565 --rc genhtml_function_coverage=1 00:07:36.565 --rc genhtml_legend=1 00:07:36.565 --rc geninfo_all_blocks=1 00:07:36.565 --rc geninfo_unexecuted_blocks=1 00:07:36.565 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.565 ' 00:07:36.565 01:23:22 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:36.565 01:23:22 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@548 -- # local fuzzers 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:36.565 01:23:22 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:36.565 01:23:22 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:36.565 01:23:22 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:36.565 01:23:22 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:36.565 01:23:22 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:36.565 01:23:22 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:36.565 01:23:22 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:36.565 01:23:22 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:36.565 01:23:22 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:36.565 01:23:22 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:07:36.826 ************************************ 00:07:36.826 START TEST nvmf_llvm_fuzz 00:07:36.826 ************************************ 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:36.826 * Looking for test storage... 
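The llvm.sh trace above shows how the fuzzer list is discovered when SPDK_TEST_FUZZER_TARGET is empty, as it is here: the test/fuzz/llvm directory is globbed, the entries are reduced to their basenames (common.sh llvm-gcov.sh nvmf vfio), and a case statement then launches only the real targets. A condensed sketch of that dispatch, reconstructed from the trace and assuming rootdir and run_test are supplied by the surrounding harness as in autotest_common.sh:

    fuzzers=("$rootdir/test/fuzz/llvm/"*)   # glob the fuzzer directory
    fuzzers=("${fuzzers[@]##*/}")           # keep basenames only
    for fuzzer in "${fuzzers[@]}"; do
        case "$fuzzer" in
            nvmf | vfio)
                run_test "${fuzzer}_llvm_fuzz" "$rootdir/test/fuzz/llvm/$fuzzer/run.sh"
                ;;
        esac
    done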
00:07:36.826 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:36.826 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.826 --rc genhtml_branch_coverage=1 00:07:36.826 --rc genhtml_function_coverage=1 00:07:36.826 --rc genhtml_legend=1 00:07:36.826 --rc geninfo_all_blocks=1 00:07:36.826 --rc geninfo_unexecuted_blocks=1 00:07:36.826 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.826 ' 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:36.826 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.826 --rc genhtml_branch_coverage=1 00:07:36.826 --rc genhtml_function_coverage=1 00:07:36.826 --rc genhtml_legend=1 00:07:36.826 --rc geninfo_all_blocks=1 00:07:36.826 --rc geninfo_unexecuted_blocks=1 00:07:36.826 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.826 ' 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:36.826 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.826 --rc genhtml_branch_coverage=1 00:07:36.826 --rc genhtml_function_coverage=1 00:07:36.826 --rc genhtml_legend=1 00:07:36.826 --rc geninfo_all_blocks=1 00:07:36.826 --rc geninfo_unexecuted_blocks=1 00:07:36.826 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.826 ' 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:36.826 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.826 --rc genhtml_branch_coverage=1 00:07:36.826 --rc genhtml_function_coverage=1 00:07:36.826 --rc genhtml_legend=1 00:07:36.826 --rc geninfo_all_blocks=1 00:07:36.826 --rc geninfo_unexecuted_blocks=1 00:07:36.826 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:36.826 ' 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:36.826 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- 
# CONFIG_OCF_PATH= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_AIO_FSDEV=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_UBLK=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_ISAL_CRYPTO=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_OPENSSL_PATH= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OCF=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_FUSE=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_VTUNE_DIR= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FSDEV=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_CRYPTO=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_PGO_USE=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_VHOST=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_DAOS=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DAOS_DIR= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_UNIT_TESTS=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_VIRTIO=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_DPDK_UADK=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_COVERAGE=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_RDMA=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_LZ4=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_URING_PATH= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_XNVME=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_VFIO_USER=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_ARCH=native 00:07:36.827 01:23:22 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_HAVE_EVP_MAC=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_URING_ZNS=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_WERROR=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_HAVE_LIBBSD=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_UBSAN=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_IPSEC_MB_DIR= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_GOLANG=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_ISAL=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_IDXD_KERNEL=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_RDMA_PROV=verbs 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_APPS=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_SHARED=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_HAVE_KEYUTILS=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_FC_PATH= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_FC=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_AVAHI=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_FIO_PLUGIN=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_RAID5F=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_EXAMPLES=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_TESTS=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_CRYPTO_MLX5=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_MAX_LCORES=128 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_IPSEC_MB=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_PGO_DIR= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_DEBUG=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_CROSS_PREFIX= 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_COPY_FILE_RANGE=y 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_URING=n 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:36.827 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:36.827 #define SPDK_CONFIG_H 00:07:36.827 #define SPDK_CONFIG_AIO_FSDEV 1 00:07:36.827 #define SPDK_CONFIG_APPS 1 00:07:36.827 #define SPDK_CONFIG_ARCH native 00:07:36.827 #undef SPDK_CONFIG_ASAN 00:07:36.827 #undef SPDK_CONFIG_AVAHI 00:07:36.827 #undef SPDK_CONFIG_CET 00:07:36.827 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:07:36.827 #define SPDK_CONFIG_COVERAGE 1 00:07:36.827 #define SPDK_CONFIG_CROSS_PREFIX 00:07:36.828 #undef SPDK_CONFIG_CRYPTO 00:07:36.828 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:36.828 #undef SPDK_CONFIG_CUSTOMOCF 00:07:36.828 #undef SPDK_CONFIG_DAOS 00:07:36.828 #define SPDK_CONFIG_DAOS_DIR 00:07:36.828 #define SPDK_CONFIG_DEBUG 1 00:07:36.828 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:36.828 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:36.828 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:36.828 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:36.828 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:36.828 #undef SPDK_CONFIG_DPDK_UADK 00:07:36.828 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:36.828 #define SPDK_CONFIG_EXAMPLES 1 00:07:36.828 #undef SPDK_CONFIG_FC 00:07:36.828 #define SPDK_CONFIG_FC_PATH 00:07:36.828 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:36.828 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:36.828 #define SPDK_CONFIG_FSDEV 1 00:07:36.828 #undef SPDK_CONFIG_FUSE 00:07:36.828 #define SPDK_CONFIG_FUZZER 1 00:07:36.828 
#define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:36.828 #undef SPDK_CONFIG_GOLANG 00:07:36.828 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:36.828 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:36.828 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:36.828 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:07:36.828 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:36.828 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:36.828 #undef SPDK_CONFIG_HAVE_LZ4 00:07:36.828 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:07:36.828 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:07:36.828 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:36.828 #define SPDK_CONFIG_IDXD 1 00:07:36.828 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:36.828 #undef SPDK_CONFIG_IPSEC_MB 00:07:36.828 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:36.828 #define SPDK_CONFIG_ISAL 1 00:07:36.828 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:36.828 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:36.828 #define SPDK_CONFIG_LIBDIR 00:07:36.828 #undef SPDK_CONFIG_LTO 00:07:36.828 #define SPDK_CONFIG_MAX_LCORES 128 00:07:36.828 #define SPDK_CONFIG_NVME_CUSE 1 00:07:36.828 #undef SPDK_CONFIG_OCF 00:07:36.828 #define SPDK_CONFIG_OCF_PATH 00:07:36.828 #define SPDK_CONFIG_OPENSSL_PATH 00:07:36.828 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:36.828 #define SPDK_CONFIG_PGO_DIR 00:07:36.828 #undef SPDK_CONFIG_PGO_USE 00:07:36.828 #define SPDK_CONFIG_PREFIX /usr/local 00:07:36.828 #undef SPDK_CONFIG_RAID5F 00:07:36.828 #undef SPDK_CONFIG_RBD 00:07:36.828 #define SPDK_CONFIG_RDMA 1 00:07:36.828 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:36.828 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:36.828 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:36.828 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:36.828 #undef SPDK_CONFIG_SHARED 00:07:36.828 #undef SPDK_CONFIG_SMA 00:07:36.828 #define SPDK_CONFIG_TESTS 1 00:07:36.828 #undef SPDK_CONFIG_TSAN 00:07:36.828 #define SPDK_CONFIG_UBLK 1 00:07:36.828 #define SPDK_CONFIG_UBSAN 1 00:07:36.828 #undef SPDK_CONFIG_UNIT_TESTS 00:07:36.828 #undef SPDK_CONFIG_URING 00:07:36.828 #define SPDK_CONFIG_URING_PATH 00:07:36.828 #undef SPDK_CONFIG_URING_ZNS 00:07:36.828 #undef SPDK_CONFIG_USDT 00:07:36.828 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:36.828 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:36.828 #define SPDK_CONFIG_VFIO_USER 1 00:07:36.828 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:36.828 #define SPDK_CONFIG_VHOST 1 00:07:36.828 #define SPDK_CONFIG_VIRTIO 1 00:07:36.828 #undef SPDK_CONFIG_VTUNE 00:07:36.828 #define SPDK_CONFIG_VTUNE_DIR 00:07:36.828 #define SPDK_CONFIG_WERROR 1 00:07:36.828 #define SPDK_CONFIG_WPDK_DIR 00:07:36.828 #undef SPDK_CONFIG_XNVME 00:07:36.828 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- 
paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:36.828 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:07:37.090 01:23:22 
llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:07:37.090 01:23:22 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 
00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : v22.11.4 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:07:37.090 01:23:22 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:07:37.090 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@179 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@179 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@180 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@180 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:37.091 01:23:22 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@185 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@185 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@189 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@189 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@193 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@193 -- # PYTHONDONTWRITEBYTECODE=1 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@197 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@197 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@198 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@198 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@202 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@203 -- # rm -rf /var/tmp/asan_suppression_file 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # cat 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@240 -- # echo leak:libfuse3.so 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # '[' -z /var/spdk/dependencies ']' 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@249 -- # export DEPENDENCY_DIR 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@253 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@253 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@254 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@254 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@257 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@257 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@258 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@258 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@263 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@263 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # _LCOV_MAIN=0 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@266 -- # _LCOV_LLVM=1 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV= 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ '' == *clang* ]] 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ 1 -eq 1 ]] 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV=1 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@271 -- # _lcov_opt[_LCOV_MAIN]= 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@276 -- # '[' 0 -eq 0 ']' 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@277 -- # export valgrind= 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@277 -- # valgrind= 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@283 -- # uname -s 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@283 -- # '[' Linux = Linux ']' 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@284 -- # HUGEMEM=4096 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # export CLEAR_HUGE=yes 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # CLEAR_HUGE=yes 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # MAKE=make 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@288 -- # MAKEFLAGS=-j112 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@304 -- # export HUGEMEM=4096 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@304 -- # HUGEMEM=4096 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # NO_HUGE=() 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@307 -- # TEST_MODE= 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@329 -- # [[ -z 823053 ]] 00:07:37.091 
01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@329 -- # kill -0 823053 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@339 -- # [[ -v testdir ]] 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # local requested_size=2147483648 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@342 -- # local mount target_dir 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local -A mounts fss sizes avails uses 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@345 -- # local source fs size avail mount use 00:07:37.091 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local storage_fallback storage_candidates 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # mktemp -udt spdk.XXXXXX 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # storage_fallback=/tmp/spdk.fz7huZ 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@354 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # [[ -n '' ]] 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@361 -- # [[ -n '' ]] 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@366 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.fz7huZ/tests/nvmf /tmp/spdk.fz7huZ 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@369 -- # requested_size=2214592512 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@338 -- # df -T 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@338 -- # grep -v Filesystem 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_devtmpfs 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=devtmpfs 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=67108864 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=67108864 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=0 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=/dev/pmem0 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=ext2 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=785162240 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=5284429824 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=4499267584 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_root 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=overlay 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=52906688512 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=61730607104 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=8823918592 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30861873152 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865301504 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=3428352 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=12340121600 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=12346122240 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=6000640 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30864969728 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865305600 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=335872 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=6173044736 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=6173057024 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=12288 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@377 -- # printf '* Looking for test storage...\n' 00:07:37.092 * Looking for test storage... 
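The test-storage search that runs here (set_test_storage in autotest_common.sh) reads df -T into per-mount tables and then takes the first candidate directory whose mount still has at least the requested space. As a reading aid, a minimal bash sketch of that selection follows, reconstructed from the read/df/awk steps traced on either side of this point; the function name, the 1024-byte scaling of the df columns, and the skip-and-continue failure handling are assumptions, everything else mirrors the trace.

set_test_storage_sketch() {
    # Names (requested_size, storage_candidates, mounts/fss/sizes/avails/uses,
    # target_space, new_size) follow the trace; the *1024 scaling of the df
    # 1K-block columns and the continue-on-failure handling are assumptions.
    local requested_size=$1; shift
    local -a storage_candidates=("$@")          # candidate dirs, as built at @354 above
    local target_dir mount target_space new_size
    local source fs size use avail _
    local -A mounts fss sizes avails uses

    # Parse df -T into per-mount tables (the @371-@374 loop in the trace).
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$((size * 1024))
        uses["$mount"]=$((use * 1024))
        avails["$mount"]=$((avail * 1024))
    done < <(df -T | grep -v Filesystem)

    for target_dir in "${storage_candidates[@]}"; do
        # Map the candidate directory to the mount point it lives on (@383).
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        target_space=${avails[$mount]:-0}
        ((target_space == 0 || target_space < requested_size)) && continue
        # On tmpfs/ramfs or the root mount, also require that projected usage
        # stays at or below 95% of the filesystem (the @391-@393 checks).
        if [[ ${fss[$mount]} == tmpfs || ${fss[$mount]} == ramfs || $mount == / ]]; then
            new_size=$((${uses[$mount]} + requested_size))
            ((new_size * 100 / ${sizes[$mount]} > 95)) && continue
        fi
        export SPDK_TEST_STORAGE=$target_dir
        printf '* Found test storage at %s\n' "$target_dir"
        return 0
    done
    return 1
}

In this run the first candidate resolves to the root overlay mount with roughly 52.9 GB available against a ~2.2 GB request, so the search succeeds immediately, as the "Found test storage" line below confirms.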
00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # local target_space new_size 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@380 -- # for target_dir in "${storage_candidates[@]}" 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # mount=/ 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # target_space=52906688512 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@386 -- # (( target_space == 0 || target_space < requested_size )) 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@389 -- # (( target_space >= requested_size )) 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == tmpfs ]] 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == ramfs ]] 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ / == / ]] 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@392 -- # new_size=11038511104 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@398 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@398 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@399 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:37.092 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # return 0 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1668 -- # set -o errtrace 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1672 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1673 -- # true 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1675 -- # xtrace_fd 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:37.092 01:23:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:37.092 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:37.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.093 --rc genhtml_branch_coverage=1 00:07:37.093 --rc genhtml_function_coverage=1 00:07:37.093 --rc genhtml_legend=1 00:07:37.093 --rc geninfo_all_blocks=1 00:07:37.093 --rc geninfo_unexecuted_blocks=1 00:07:37.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.093 ' 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:37.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.093 --rc genhtml_branch_coverage=1 00:07:37.093 --rc genhtml_function_coverage=1 00:07:37.093 --rc genhtml_legend=1 00:07:37.093 --rc geninfo_all_blocks=1 00:07:37.093 --rc geninfo_unexecuted_blocks=1 00:07:37.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.093 ' 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:37.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.093 --rc genhtml_branch_coverage=1 00:07:37.093 --rc genhtml_function_coverage=1 00:07:37.093 --rc genhtml_legend=1 00:07:37.093 --rc geninfo_all_blocks=1 00:07:37.093 --rc geninfo_unexecuted_blocks=1 00:07:37.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.093 ' 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:37.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:37.093 --rc genhtml_branch_coverage=1 00:07:37.093 --rc genhtml_function_coverage=1 00:07:37.093 --rc genhtml_legend=1 00:07:37.093 --rc geninfo_all_blocks=1 00:07:37.093 --rc geninfo_unexecuted_blocks=1 00:07:37.093 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:37.093 ' 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:37.093 01:23:23 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:37.093 01:23:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:37.093 [2024-12-17 01:23:23.081405] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:37.093 [2024-12-17 01:23:23.081480] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid823121 ] 00:07:37.352 [2024-12-17 01:23:23.340538] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.611 [2024-12-17 01:23:23.372122] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.611 [2024-12-17 01:23:23.424398] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:37.611 [2024-12-17 01:23:23.440781] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:37.611 INFO: Running with entropic power schedule (0xFF, 100). 00:07:37.611 INFO: Seed: 20096905 00:07:37.611 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:37.611 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:37.611 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:37.611 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.611 #2 INITED exec/s: 0 rss: 64Mb 00:07:37.611 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:37.611 This may also happen if the target rejected all inputs we tried so far 00:07:37.611 [2024-12-17 01:23:23.496034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.611 [2024-12-17 01:23:23.496067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.869 NEW_FUNC[1/713]: 0x452788 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:37.869 NEW_FUNC[2/713]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:37.869 #19 NEW cov: 12131 ft: 12128 corp: 2/79b lim: 320 exec/s: 0 rss: 72Mb L: 78/78 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:37.869 [2024-12-17 01:23:23.826993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.869 [2024-12-17 01:23:23.827026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.869 [2024-12-17 01:23:23.827085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:37.869 [2024-12-17 01:23:23.827099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.869 NEW_FUNC[1/2]: 0x150df38 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2213 00:07:38.127 NEW_FUNC[2/2]: 0x19489f8 in nvme_get_sgl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:159 00:07:38.128 #25 NEW cov: 12298 ft: 12900 corp: 3/233b lim: 320 exec/s: 0 rss: 72Mb L: 154/154 MS: 1 
InsertRepeatedBytes- 00:07:38.128 [2024-12-17 01:23:23.887082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.128 [2024-12-17 01:23:23.887109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.128 [2024-12-17 01:23:23.887168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.128 [2024-12-17 01:23:23.887182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.128 #26 NEW cov: 12304 ft: 13033 corp: 4/387b lim: 320 exec/s: 0 rss: 73Mb L: 154/154 MS: 1 ShuffleBytes- 00:07:38.128 [2024-12-17 01:23:23.947145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.128 [2024-12-17 01:23:23.947173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.128 #27 NEW cov: 12389 ft: 13363 corp: 5/465b lim: 320 exec/s: 0 rss: 73Mb L: 78/154 MS: 1 CMP- DE: "\004\000\000\000\000\000\000\000"- 00:07:38.128 [2024-12-17 01:23:23.987267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:a2a2a2a2 00:07:38.128 [2024-12-17 01:23:23.987292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.128 #28 NEW cov: 12389 ft: 13602 corp: 6/547b lim: 320 exec/s: 0 rss: 73Mb L: 82/154 MS: 1 InsertRepeatedBytes- 00:07:38.128 [2024-12-17 01:23:24.027485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:400 cdw10:00000000 cdw11:00000000 00:07:38.128 [2024-12-17 01:23:24.027511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.128 [2024-12-17 01:23:24.027567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.128 [2024-12-17 01:23:24.027580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.128 #29 NEW cov: 12389 ft: 13649 corp: 7/701b lim: 320 exec/s: 0 rss: 73Mb L: 154/154 MS: 1 ChangeBit- 00:07:38.128 [2024-12-17 01:23:24.067529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.128 [2024-12-17 01:23:24.067554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.128 [2024-12-17 01:23:24.067613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.128 [2024-12-17 01:23:24.067627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.128 #30 NEW cov: 12389 ft: 13696 corp: 8/855b lim: 320 exec/s: 0 rss: 73Mb L: 154/154 MS: 1 ChangeBit- 00:07:38.128 [2024-12-17 01:23:24.107609] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.128 [2024-12-17 01:23:24.107634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.128 #31 NEW cov: 12389 ft: 13739 corp: 9/933b lim: 320 exec/s: 0 rss: 73Mb L: 78/154 MS: 1 CopyPart- 00:07:38.386 [2024-12-17 01:23:24.147708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.386 [2024-12-17 01:23:24.147733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.386 #37 NEW cov: 12389 ft: 13800 corp: 10/1011b lim: 320 exec/s: 0 rss: 73Mb L: 78/154 MS: 1 ChangeByte- 00:07:38.386 [2024-12-17 01:23:24.187908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.386 [2024-12-17 01:23:24.187933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.386 [2024-12-17 01:23:24.187992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.386 [2024-12-17 01:23:24.188005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.386 #38 NEW cov: 12389 ft: 13839 corp: 11/1165b lim: 320 exec/s: 0 rss: 73Mb L: 154/154 MS: 1 ChangeByte- 00:07:38.386 [2024-12-17 01:23:24.248012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b000000 cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff00041b 00:07:38.386 [2024-12-17 01:23:24.248038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.386 #43 NEW cov: 12390 ft: 13869 corp: 12/1241b lim: 320 exec/s: 0 rss: 73Mb L: 76/154 MS: 5 ChangeBit-InsertRepeatedBytes-PersAutoDict-CopyPart-CrossOver- DE: "\004\000\000\000\000\000\000\000"- 00:07:38.386 [2024-12-17 01:23:24.288187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:400 cdw10:00000000 cdw11:00000000 00:07:38.386 [2024-12-17 01:23:24.288212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.386 [2024-12-17 01:23:24.288272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.386 [2024-12-17 01:23:24.288286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.386 #44 NEW cov: 12390 ft: 13941 corp: 13/1396b lim: 320 exec/s: 0 rss: 73Mb L: 155/155 MS: 1 InsertByte- 00:07:38.386 [2024-12-17 01:23:24.348301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.386 [2024-12-17 01:23:24.348326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.645 NEW_FUNC[1/1]: 0x1c16738 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:38.645 #45 NEW cov: 12413 ft: 14016 corp: 14/1474b lim: 320 exec/s: 0 rss: 73Mb L: 78/155 MS: 1 ChangeByte- 00:07:38.645 [2024-12-17 01:23:24.408480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.645 [2024-12-17 01:23:24.408506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.645 #46 NEW cov: 12413 ft: 14053 corp: 15/1552b lim: 320 exec/s: 0 rss: 73Mb L: 78/155 MS: 1 ShuffleBytes- 00:07:38.645 [2024-12-17 01:23:24.468701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.645 [2024-12-17 01:23:24.468726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.645 [2024-12-17 01:23:24.468784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.645 [2024-12-17 01:23:24.468807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.645 #47 NEW cov: 12413 ft: 14112 corp: 16/1714b lim: 320 exec/s: 47 rss: 73Mb L: 162/162 MS: 1 CMP- DE: "\033\033\242\340I\311\005\000"- 00:07:38.645 [2024-12-17 01:23:24.508800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.645 [2024-12-17 01:23:24.508826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.645 [2024-12-17 01:23:24.508883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.645 [2024-12-17 01:23:24.508897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.645 #48 NEW cov: 12413 ft: 14138 corp: 17/1876b lim: 320 exec/s: 48 rss: 73Mb L: 162/162 MS: 1 CMP- DE: "\377\004\311O#\233\021l"- 00:07:38.645 [2024-12-17 01:23:24.568907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b000000 cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff00041b 00:07:38.645 [2024-12-17 01:23:24.568932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.645 #49 NEW cov: 12413 ft: 14161 corp: 18/1948b lim: 320 exec/s: 49 rss: 73Mb L: 72/162 MS: 1 EraseBytes- 00:07:38.645 [2024-12-17 01:23:24.629151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.645 [2024-12-17 01:23:24.629176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.645 [2024-12-17 01:23:24.629234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:38.645 [2024-12-17 01:23:24.629249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.903 #50 NEW cov: 12413 ft: 14209 corp: 19/2110b lim: 320 exec/s: 50 rss: 73Mb L: 162/162 MS: 1 ChangeByte- 00:07:38.903 [2024-12-17 01:23:24.689238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b000000 cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff00041b 00:07:38.903 [2024-12-17 01:23:24.689263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.903 #51 NEW cov: 12413 ft: 14213 corp: 20/2186b lim: 320 exec/s: 51 rss: 73Mb L: 76/162 MS: 1 ChangeByte- 00:07:38.903 [2024-12-17 01:23:24.729307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.903 [2024-12-17 01:23:24.729332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.903 #52 NEW cov: 12413 ft: 14240 corp: 21/2264b lim: 320 exec/s: 52 rss: 73Mb L: 78/162 MS: 1 CMP- DE: "\377\377\377\000"- 00:07:38.903 [2024-12-17 01:23:24.769485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00b10000 00:07:38.903 [2024-12-17 01:23:24.769511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.903 #53 NEW cov: 12413 ft: 14305 corp: 22/2343b lim: 320 exec/s: 53 rss: 74Mb L: 79/162 MS: 1 InsertByte- 00:07:38.903 [2024-12-17 01:23:24.829772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.903 [2024-12-17 01:23:24.829801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.903 [2024-12-17 01:23:24.829854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.903 [2024-12-17 01:23:24.829867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.903 #54 NEW cov: 12413 ft: 14406 corp: 23/2476b lim: 320 exec/s: 54 rss: 74Mb L: 133/162 MS: 1 CopyPart- 00:07:38.903 [2024-12-17 01:23:24.869714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:38.903 [2024-12-17 01:23:24.869739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.903 #55 NEW cov: 12413 ft: 14487 corp: 24/2583b lim: 320 exec/s: 55 rss: 74Mb L: 107/162 MS: 1 CopyPart- 00:07:39.162 [2024-12-17 01:23:24.909955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.162 [2024-12-17 01:23:24.909980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.162 [2024-12-17 01:23:24.910040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.162 [2024-12-17 01:23:24.910054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:39.162 #56 NEW cov: 12413 ft: 14503 corp: 25/2745b lim: 320 exec/s: 56 rss: 74Mb L: 162/162 MS: 1 CrossOver- 00:07:39.162 [2024-12-17 01:23:24.950027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:4 nsid:1b1b1b1b cdw10:00000000 cdw11:1b1b1b00 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff00041b 00:07:39.162 [2024-12-17 01:23:24.950054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.162 #57 NEW cov: 12413 ft: 14512 corp: 26/2823b lim: 320 exec/s: 57 rss: 74Mb L: 78/162 MS: 1 CMP- DE: "\011\000"- 00:07:39.162 [2024-12-17 01:23:25.010120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.162 [2024-12-17 01:23:25.010146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.162 #58 NEW cov: 12413 ft: 14515 corp: 27/2901b lim: 320 exec/s: 58 rss: 74Mb L: 78/162 MS: 1 ChangeBinInt- 00:07:39.162 [2024-12-17 01:23:25.050250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.162 [2024-12-17 01:23:25.050277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.162 #59 NEW cov: 12413 ft: 14525 corp: 28/2979b lim: 320 exec/s: 59 rss: 74Mb L: 78/162 MS: 1 ChangeBit- 00:07:39.162 [2024-12-17 01:23:25.110427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.162 [2024-12-17 01:23:25.110453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.162 #60 NEW cov: 12413 ft: 14528 corp: 29/3057b lim: 320 exec/s: 60 rss: 74Mb L: 78/162 MS: 1 ShuffleBytes- 00:07:39.420 [2024-12-17 01:23:25.170584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.420 [2024-12-17 01:23:25.170610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.420 #61 NEW cov: 12413 ft: 14548 corp: 30/3173b lim: 320 exec/s: 61 rss: 74Mb L: 116/162 MS: 1 EraseBytes- 00:07:39.420 [2024-12-17 01:23:25.230941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.420 [2024-12-17 01:23:25.230970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.420 [2024-12-17 01:23:25.231033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:39.420 [2024-12-17 01:23:25.231049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.420 #62 NEW cov: 12413 ft: 14597 corp: 31/3335b lim: 320 exec/s: 62 rss: 74Mb L: 162/162 MS: 1 ChangeByte- 00:07:39.420 [2024-12-17 01:23:25.290924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.420 [2024-12-17 01:23:25.290951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.420 #63 NEW cov: 12413 ft: 14598 corp: 32/3413b lim: 320 exec/s: 63 rss: 74Mb L: 78/162 MS: 1 ShuffleBytes- 00:07:39.420 [2024-12-17 01:23:25.351075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b000000 cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff00041b 00:07:39.420 [2024-12-17 01:23:25.351100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.420 #64 NEW cov: 12413 ft: 14604 corp: 33/3489b lim: 320 exec/s: 64 rss: 74Mb L: 76/162 MS: 1 CopyPart- 00:07:39.420 [2024-12-17 01:23:25.391226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.420 [2024-12-17 01:23:25.391251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.420 [2024-12-17 01:23:25.391312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff6c119b234f 00:07:39.420 [2024-12-17 01:23:25.391325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.679 #65 NEW cov: 12413 ft: 14621 corp: 34/3651b lim: 320 exec/s: 65 rss: 74Mb L: 162/162 MS: 1 PersAutoDict- DE: "\377\004\311O#\233\021l"- 00:07:39.679 [2024-12-17 01:23:25.451354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b000000 cdw11:e4e4e21b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffff00041b 00:07:39.679 [2024-12-17 01:23:25.451379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.679 #71 NEW cov: 12413 ft: 14628 corp: 35/3727b lim: 320 exec/s: 35 rss: 74Mb L: 76/162 MS: 1 ChangeBinInt- 00:07:39.679 #71 DONE cov: 12413 ft: 14628 corp: 35/3727b lim: 320 exec/s: 35 rss: 74Mb 00:07:39.679 ###### Recommended dictionary. ###### 00:07:39.679 "\004\000\000\000\000\000\000\000" # Uses: 2 00:07:39.679 "\033\033\242\340I\311\005\000" # Uses: 0 00:07:39.679 "\377\004\311O#\233\021l" # Uses: 1 00:07:39.679 "\377\377\377\000" # Uses: 0 00:07:39.679 "\011\000" # Uses: 0 00:07:39.679 ###### End of recommended dictionary. 
###### 00:07:39.679 Done 71 runs in 2 second(s) 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:39.679 01:23:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:39.679 [2024-12-17 01:23:25.654071] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:39.679 [2024-12-17 01:23:25.654152] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid823654 ] 00:07:39.937 [2024-12-17 01:23:25.911734] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.195 [2024-12-17 01:23:25.942835] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.195 [2024-12-17 01:23:25.995184] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:40.195 [2024-12-17 01:23:26.011512] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:40.195 INFO: Running with entropic power schedule (0xFF, 100). 
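The trace above has just repeated, for fuzzer index 1, the same per-instance setup that nvmf/run.sh performed for index 0 earlier: port 4401 instead of 4400, corpus directory llvm_nvmf_1, config /tmp/fuzz_json_1.conf. As a reading aid, here is a bash sketch of that setup reconstructed from the traced commands; the wrapper function name is made up, and the output redirections of the sed and echo steps (into the per-instance config and the LSAN suppression file) are inferred rather than visible in the trace.

start_llvm_fuzz_sketch() {
    # Reconstruction of the nvmf/run.sh start_llvm_fuzz steps traced above.
    local fuzzer_type=$1 timen=$2 core=$3
    local rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # path as seen in the trace
    local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
    local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    local suppress_file=/var/tmp/suppress_nvmf_fuzz
    local LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0   # as in run.sh@32
    local port trid

    # Each fuzzer index listens on its own TCP port: 4400, 4401, ... (run.sh@34).
    port="44$(printf %02d "$fuzzer_type")"
    mkdir -p "$corpus_dir"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    # Rewrite the default listener port (4420) in the JSON config for this
    # instance; the redirect into $nvmf_cfg is assumed (run.sh@38).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    # Known, intentional leaks are suppressed for LeakSanitizer; the redirect
    # into $suppress_file is assumed (run.sh@41-@42).
    echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
    echo leak:nvmf_ctrlr_create >> "$suppress_file"

    # -s 512 is mem_size from run.sh@69; the remaining flags match run.sh@45.
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$rootdir/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type"
}

Called as start_llvm_fuzz_sketch 0 1 0x1 it would reproduce the first invocation earlier in the log; the run now starting corresponds to index 1, hence trsvcid 4401 and corpus directory llvm_nvmf_1 in the output that follows.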
00:07:40.195 INFO: Seed: 2591097246 00:07:40.195 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:40.195 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:40.195 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:40.195 INFO: A corpus is not provided, starting from an empty corpus 00:07:40.195 #2 INITED exec/s: 0 rss: 64Mb 00:07:40.195 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:40.195 This may also happen if the target rejected all inputs we tried so far 00:07:40.195 [2024-12-17 01:23:26.056618] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:40.195 [2024-12-17 01:23:26.056733] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:40.195 [2024-12-17 01:23:26.056954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.195 [2024-12-17 01:23:26.056986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.195 [2024-12-17 01:23:26.057041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.195 [2024-12-17 01:23:26.057057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.453 NEW_FUNC[1/715]: 0x453088 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:40.453 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:40.453 #17 NEW cov: 12231 ft: 12223 corp: 2/15b lim: 30 exec/s: 0 rss: 72Mb L: 14/14 MS: 5 InsertByte-ShuffleBytes-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:07:40.453 [2024-12-17 01:23:26.387832] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.453 [2024-12-17 01:23:26.387971] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.453 [2024-12-17 01:23:26.388092] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.453 [2024-12-17 01:23:26.388210] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.453 [2024-12-17 01:23:26.388475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.453 [2024-12-17 01:23:26.388532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.453 [2024-12-17 01:23:26.388615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.453 [2024-12-17 01:23:26.388643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.453 [2024-12-17 01:23:26.388723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:40.453 [2024-12-17 01:23:26.388750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.453 [2024-12-17 01:23:26.388836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.453 [2024-12-17 01:23:26.388863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.453 #19 NEW cov: 12344 ft: 13539 corp: 3/42b lim: 30 exec/s: 0 rss: 72Mb L: 27/27 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:40.453 [2024-12-17 01:23:26.437633] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001d1e 00:07:40.453 [2024-12-17 01:23:26.437754] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:40.453 [2024-12-17 01:23:26.437979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.453 [2024-12-17 01:23:26.438007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.453 [2024-12-17 01:23:26.438061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:1e1e81e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.453 [2024-12-17 01:23:26.438075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.711 #20 NEW cov: 12350 ft: 13818 corp: 4/56b lim: 30 exec/s: 0 rss: 72Mb L: 14/27 MS: 1 ChangeBinInt- 00:07:40.711 [2024-12-17 01:23:26.497763] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.711 [2024-12-17 01:23:26.497997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a1a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.711 [2024-12-17 01:23:26.498025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.711 #21 NEW cov: 12435 ft: 14489 corp: 5/67b lim: 30 exec/s: 0 rss: 72Mb L: 11/27 MS: 1 CrossOver- 00:07:40.711 [2024-12-17 01:23:26.537954] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.711 [2024-12-17 01:23:26.538079] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.711 [2024-12-17 01:23:26.538194] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.711 [2024-12-17 01:23:26.538313] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.711 [2024-12-17 01:23:26.538527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.711 [2024-12-17 01:23:26.538555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.711 [2024-12-17 01:23:26.538612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.711 [2024-12-17 01:23:26.538627] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.712 [2024-12-17 01:23:26.538682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.712 [2024-12-17 01:23:26.538696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.712 [2024-12-17 01:23:26.538749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a3a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.712 [2024-12-17 01:23:26.538763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.712 #22 NEW cov: 12435 ft: 14633 corp: 6/94b lim: 30 exec/s: 0 rss: 72Mb L: 27/27 MS: 1 ChangeBit- 00:07:40.712 [2024-12-17 01:23:26.598097] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:40.712 [2024-12-17 01:23:26.598219] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:40.712 [2024-12-17 01:23:26.598438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.712 [2024-12-17 01:23:26.598466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.712 [2024-12-17 01:23:26.598522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.712 [2024-12-17 01:23:26.598537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.712 #23 NEW cov: 12435 ft: 14757 corp: 7/108b lim: 30 exec/s: 0 rss: 72Mb L: 14/27 MS: 1 CrossOver- 00:07:40.712 [2024-12-17 01:23:26.638272] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.712 [2024-12-17 01:23:26.638392] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.712 [2024-12-17 01:23:26.638507] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.712 [2024-12-17 01:23:26.638617] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.712 [2024-12-17 01:23:26.638846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.712 [2024-12-17 01:23:26.638873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.712 [2024-12-17 01:23:26.638931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.712 [2024-12-17 01:23:26.638946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.712 [2024-12-17 01:23:26.639002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.712 [2024-12-17 01:23:26.639016] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.712 [2024-12-17 01:23:26.639076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a3a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.712 [2024-12-17 01:23:26.639090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.712 #24 NEW cov: 12435 ft: 14809 corp: 8/135b lim: 30 exec/s: 0 rss: 72Mb L: 27/27 MS: 1 ChangeBit- 00:07:40.712 [2024-12-17 01:23:26.698330] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.712 [2024-12-17 01:23:26.698547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.712 [2024-12-17 01:23:26.698572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.970 #25 NEW cov: 12435 ft: 14852 corp: 9/146b lim: 30 exec/s: 0 rss: 72Mb L: 11/27 MS: 1 ChangeBit- 00:07:40.970 [2024-12-17 01:23:26.758586] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e11d 00:07:40.970 [2024-12-17 01:23:26.758708] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:40.970 [2024-12-17 01:23:26.758837] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000e1e1 00:07:40.970 [2024-12-17 01:23:26.759064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.970 [2024-12-17 01:23:26.759092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.970 [2024-12-17 01:23:26.759149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:1e1e811e cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.970 [2024-12-17 01:23:26.759164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.970 [2024-12-17 01:23:26.759218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1d1e021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.970 [2024-12-17 01:23:26.759232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.970 #26 NEW cov: 12435 ft: 15087 corp: 10/168b lim: 30 exec/s: 0 rss: 72Mb L: 22/27 MS: 1 CopyPart- 00:07:40.970 [2024-12-17 01:23:26.818757] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.970 [2024-12-17 01:23:26.818887] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.970 [2024-12-17 01:23:26.818998] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.970 [2024-12-17 01:23:26.819104] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.970 [2024-12-17 01:23:26.819322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.970 [2024-12-17 01:23:26.819350] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.970 [2024-12-17 01:23:26.819407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.970 [2024-12-17 01:23:26.819421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.970 [2024-12-17 01:23:26.819475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.970 [2024-12-17 01:23:26.819489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.970 [2024-12-17 01:23:26.819544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a2a023a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.970 [2024-12-17 01:23:26.819561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.970 #27 NEW cov: 12435 ft: 15114 corp: 11/195b lim: 30 exec/s: 0 rss: 72Mb L: 27/27 MS: 1 CopyPart- 00:07:40.970 [2024-12-17 01:23:26.858807] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001d1e 00:07:40.970 [2024-12-17 01:23:26.858938] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:40.970 [2024-12-17 01:23:26.859166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e1e181ec cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.970 [2024-12-17 01:23:26.859193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.970 [2024-12-17 01:23:26.859252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:1e1e81e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.970 [2024-12-17 01:23:26.859266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.971 #28 NEW cov: 12435 ft: 15137 corp: 12/209b lim: 30 exec/s: 0 rss: 72Mb L: 14/27 MS: 1 ChangeByte- 00:07:40.971 [2024-12-17 01:23:26.899025] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.971 [2024-12-17 01:23:26.899148] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.971 [2024-12-17 01:23:26.899266] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.971 [2024-12-17 01:23:26.899389] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:40.971 [2024-12-17 01:23:26.899610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.971 [2024-12-17 01:23:26.899637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.971 [2024-12-17 01:23:26.899694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3a85022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.971 [2024-12-17 01:23:26.899708] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.971 [2024-12-17 01:23:26.899762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.971 [2024-12-17 01:23:26.899776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.971 [2024-12-17 01:23:26.899827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a3a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.971 [2024-12-17 01:23:26.899841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.971 #34 NEW cov: 12435 ft: 15192 corp: 13/236b lim: 30 exec/s: 0 rss: 72Mb L: 27/27 MS: 1 ChangeByte- 00:07:40.971 [2024-12-17 01:23:26.959180] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:40.971 [2024-12-17 01:23:26.959300] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:40.971 [2024-12-17 01:23:26.959410] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:40.971 [2024-12-17 01:23:26.959520] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:40.971 [2024-12-17 01:23:26.959741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.971 [2024-12-17 01:23:26.959768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.971 [2024-12-17 01:23:26.959827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.971 [2024-12-17 01:23:26.959843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.971 [2024-12-17 01:23:26.959898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:4aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.971 [2024-12-17 01:23:26.959913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.971 [2024-12-17 01:23:26.959968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.971 [2024-12-17 01:23:26.959982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.229 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:41.229 #35 NEW cov: 12458 ft: 15275 corp: 14/261b lim: 30 exec/s: 0 rss: 73Mb L: 25/27 MS: 1 InsertRepeatedBytes- 00:07:41.229 [2024-12-17 01:23:26.999259] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.229 [2024-12-17 01:23:26.999378] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.229 [2024-12-17 01:23:26.999491] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid 
log page offset 0x200002a2a 00:07:41.229 [2024-12-17 01:23:26.999604] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.229 [2024-12-17 01:23:26.999817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.229 [2024-12-17 01:23:26.999844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.229 [2024-12-17 01:23:26.999897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.229 [2024-12-17 01:23:26.999911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.229 [2024-12-17 01:23:26.999963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.229 [2024-12-17 01:23:26.999977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.229 [2024-12-17 01:23:27.000028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.229 [2024-12-17 01:23:27.000042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.229 #36 NEW cov: 12458 ft: 15289 corp: 15/290b lim: 30 exec/s: 0 rss: 73Mb L: 29/29 MS: 1 CrossOver- 00:07:41.229 [2024-12-17 01:23:27.039317] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001d1e 00:07:41.229 [2024-12-17 01:23:27.039443] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:41.229 [2024-12-17 01:23:27.039683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e1e181ec cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.229 [2024-12-17 01:23:27.039710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.230 [2024-12-17 01:23:27.039766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:1e1e81e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.039781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.230 #37 NEW cov: 12458 ft: 15331 corp: 16/304b lim: 30 exec/s: 37 rss: 73Mb L: 14/29 MS: 1 CopyPart- 00:07:41.230 [2024-12-17 01:23:27.099611] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.230 [2024-12-17 01:23:27.099737] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.230 [2024-12-17 01:23:27.099858] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.230 [2024-12-17 01:23:27.099966] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.230 [2024-12-17 01:23:27.100078] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000e14a 00:07:41.230 [2024-12-17 01:23:27.100298] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.100325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.230 [2024-12-17 01:23:27.100380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:332a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.100394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.230 [2024-12-17 01:23:27.100448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.100461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.230 [2024-12-17 01:23:27.100518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.100531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.230 [2024-12-17 01:23:27.100583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.100597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.230 #38 NEW cov: 12458 ft: 15441 corp: 17/334b lim: 30 exec/s: 38 rss: 73Mb L: 30/30 MS: 1 InsertByte- 00:07:41.230 [2024-12-17 01:23:27.159722] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e11d 00:07:41.230 [2024-12-17 01:23:27.159851] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:41.230 [2024-12-17 01:23:27.159981] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000e1e1 00:07:41.230 [2024-12-17 01:23:27.160093] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.230 [2024-12-17 01:23:27.160315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.160342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.230 [2024-12-17 01:23:27.160399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:1e1e811e cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.160415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.230 [2024-12-17 01:23:27.160471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1d1e021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.160485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.230 [2024-12-17 
01:23:27.160542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:e1e183ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.160560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.230 #39 NEW cov: 12458 ft: 15456 corp: 18/361b lim: 30 exec/s: 39 rss: 73Mb L: 27/30 MS: 1 InsertRepeatedBytes- 00:07:41.230 [2024-12-17 01:23:27.219868] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:41.230 [2024-12-17 01:23:27.219990] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:41.230 [2024-12-17 01:23:27.220202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:03e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.220229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.230 [2024-12-17 01:23:27.220288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.230 [2024-12-17 01:23:27.220302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.489 #40 NEW cov: 12458 ft: 15499 corp: 19/375b lim: 30 exec/s: 40 rss: 73Mb L: 14/30 MS: 1 ChangeByte- 00:07:41.489 [2024-12-17 01:23:27.260005] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.489 [2024-12-17 01:23:27.260128] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.489 [2024-12-17 01:23:27.260240] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.489 [2024-12-17 01:23:27.260349] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.489 [2024-12-17 01:23:27.260566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.260594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.260653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.260668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.260725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a0228 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.260739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.260797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a3a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.260811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:07:41.489 #41 NEW cov: 12458 ft: 15521 corp: 20/402b lim: 30 exec/s: 41 rss: 73Mb L: 27/30 MS: 1 ChangeBit- 00:07:41.489 [2024-12-17 01:23:27.300099] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.489 [2024-12-17 01:23:27.300223] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.489 [2024-12-17 01:23:27.300336] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000282a 00:07:41.489 [2024-12-17 01:23:27.300452] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.489 [2024-12-17 01:23:27.300687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.300714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.300771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.300789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.300911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.300926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.300980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a2a023a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.300994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.489 #42 NEW cov: 12458 ft: 15537 corp: 21/429b lim: 30 exec/s: 42 rss: 73Mb L: 27/30 MS: 1 ChangeBit- 00:07:41.489 [2024-12-17 01:23:27.360242] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:41.489 [2024-12-17 01:23:27.360364] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:41.489 [2024-12-17 01:23:27.360582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:03e181e0 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.360608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.360663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.360678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.489 #43 NEW cov: 12458 ft: 15540 corp: 22/443b lim: 30 exec/s: 43 rss: 73Mb L: 14/30 MS: 1 ChangeBit- 00:07:41.489 [2024-12-17 01:23:27.420427] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e11d 00:07:41.489 [2024-12-17 01:23:27.420556] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 
0x10000e1e1 00:07:41.489 [2024-12-17 01:23:27.420690] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000e1e1 00:07:41.489 [2024-12-17 01:23:27.420917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.420945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.421003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:1e1e811e cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.421018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.421077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1d1e021e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.421091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.489 #44 NEW cov: 12458 ft: 15624 corp: 23/465b lim: 30 exec/s: 44 rss: 73Mb L: 22/30 MS: 1 CrossOver- 00:07:41.489 [2024-12-17 01:23:27.460609] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300005757 00:07:41.489 [2024-12-17 01:23:27.460735] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300005757 00:07:41.489 [2024-12-17 01:23:27.460855] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:41.489 [2024-12-17 01:23:27.460968] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004aaa 00:07:41.489 [2024-12-17 01:23:27.461190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:03e183e1 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.461219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.461277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:57578357 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.461292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.461348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:57e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.461363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.489 [2024-12-17 01:23:27.461419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.489 [2024-12-17 01:23:27.461433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.489 #45 NEW cov: 12458 ft: 15630 corp: 24/489b lim: 30 exec/s: 45 rss: 73Mb L: 24/30 MS: 1 InsertRepeatedBytes- 00:07:41.748 [2024-12-17 01:23:27.500740] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: 
Invalid log page offset 0x200002a2a 00:07:41.748 [2024-12-17 01:23:27.500873] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.748 [2024-12-17 01:23:27.500992] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000282a 00:07:41.748 [2024-12-17 01:23:27.501106] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.748 [2024-12-17 01:23:27.501222] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.748 [2024-12-17 01:23:27.501452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.748 [2024-12-17 01:23:27.501479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.748 [2024-12-17 01:23:27.501535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.748 [2024-12-17 01:23:27.501550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.748 [2024-12-17 01:23:27.501606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.748 [2024-12-17 01:23:27.501620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.748 [2024-12-17 01:23:27.501675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a2a023a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.748 [2024-12-17 01:23:27.501688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.748 [2024-12-17 01:23:27.501745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.748 [2024-12-17 01:23:27.501759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.748 #46 NEW cov: 12458 ft: 15650 corp: 25/519b lim: 30 exec/s: 46 rss: 73Mb L: 30/30 MS: 1 CrossOver- 00:07:41.748 [2024-12-17 01:23:27.560889] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.748 [2024-12-17 01:23:27.561008] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.748 [2024-12-17 01:23:27.561123] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.748 [2024-12-17 01:23:27.561356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a5c83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.748 [2024-12-17 01:23:27.561383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.748 [2024-12-17 01:23:27.561439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.748 [2024-12-17 01:23:27.561453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.748 [2024-12-17 01:23:27.561511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.748 [2024-12-17 01:23:27.561526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.748 #50 NEW cov: 12458 ft: 15655 corp: 26/537b lim: 30 exec/s: 50 rss: 73Mb L: 18/30 MS: 4 CopyPart-InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:41.748 [2024-12-17 01:23:27.600988] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.748 [2024-12-17 01:23:27.601113] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.748 [2024-12-17 01:23:27.601232] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.748 [2024-12-17 01:23:27.601349] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:41.748 [2024-12-17 01:23:27.601584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.748 [2024-12-17 01:23:27.601612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.749 [2024-12-17 01:23:27.601669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.749 [2024-12-17 01:23:27.601684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.749 [2024-12-17 01:23:27.601751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.749 [2024-12-17 01:23:27.601765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.749 [2024-12-17 01:23:27.601823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a3a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.749 [2024-12-17 01:23:27.601837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.749 #51 NEW cov: 12458 ft: 15673 corp: 27/564b lim: 30 exec/s: 51 rss: 73Mb L: 27/30 MS: 1 CrossOver- 00:07:41.749 [2024-12-17 01:23:27.641073] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.749 [2024-12-17 01:23:27.641198] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.749 [2024-12-17 01:23:27.641310] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.749 [2024-12-17 01:23:27.641533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a5c83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.749 [2024-12-17 01:23:27.641560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.749 [2024-12-17 01:23:27.641618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 
cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.749 [2024-12-17 01:23:27.641633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.749 [2024-12-17 01:23:27.641693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.749 [2024-12-17 01:23:27.641707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.749 #52 NEW cov: 12458 ft: 15686 corp: 28/584b lim: 30 exec/s: 52 rss: 73Mb L: 20/30 MS: 1 CopyPart- 00:07:41.749 [2024-12-17 01:23:27.701203] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:41.749 [2024-12-17 01:23:27.701425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.749 [2024-12-17 01:23:27.701450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.749 #53 NEW cov: 12458 ft: 15737 corp: 29/594b lim: 30 exec/s: 53 rss: 74Mb L: 10/30 MS: 1 EraseBytes- 00:07:42.007 [2024-12-17 01:23:27.761448] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300002a2a 00:07:42.007 [2024-12-17 01:23:27.761573] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:42.007 [2024-12-17 01:23:27.761688] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:42.007 [2024-12-17 01:23:27.761812] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:42.007 [2024-12-17 01:23:27.762039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a2a8300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.762066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.762124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.762139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.762195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.762209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.762267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:2a3a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.762282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.008 #54 NEW cov: 12458 ft: 15743 corp: 30/621b lim: 30 exec/s: 54 rss: 74Mb L: 27/30 MS: 1 ChangeBinInt- 00:07:42.008 [2024-12-17 01:23:27.801437] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 
0x200002a2a 00:07:42.008 [2024-12-17 01:23:27.801653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a1a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.801680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.008 #55 NEW cov: 12458 ft: 15759 corp: 31/628b lim: 30 exec/s: 55 rss: 74Mb L: 7/30 MS: 1 EraseBytes- 00:07:42.008 [2024-12-17 01:23:27.841663] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:42.008 [2024-12-17 01:23:27.841784] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a99 00:07:42.008 [2024-12-17 01:23:27.841903] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:42.008 [2024-12-17 01:23:27.842010] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:42.008 [2024-12-17 01:23:27.842117] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:42.008 [2024-12-17 01:23:27.842341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.842368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.842426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.842441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.842496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.842510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.842567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.842582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.842639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.842653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.008 #56 NEW cov: 12458 ft: 15771 corp: 32/658b lim: 30 exec/s: 56 rss: 74Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:42.008 [2024-12-17 01:23:27.901761] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.008 [2024-12-17 01:23:27.901890] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.008 [2024-12-17 01:23:27.901998] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.008 [2024-12-17 01:23:27.902220] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:1a5c83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.902247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.902305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffef83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.902320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.902376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.902391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.008 #57 NEW cov: 12458 ft: 15776 corp: 33/678b lim: 30 exec/s: 57 rss: 74Mb L: 20/30 MS: 1 ChangeBit- 00:07:42.008 [2024-12-17 01:23:27.941928] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200002a2a 00:07:42.008 [2024-12-17 01:23:27.942046] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200009999 00:07:42.008 [2024-12-17 01:23:27.942154] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:42.008 [2024-12-17 01:23:27.942265] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:42.008 [2024-12-17 01:23:27.942371] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009999 00:07:42.008 [2024-12-17 01:23:27.942594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.942624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.942680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:2a2a022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.942694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.942752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.942766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.942824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.942838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:27.942895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:99998199 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:27.942909] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.008 #58 NEW cov: 12458 ft: 15799 corp: 34/708b lim: 30 exec/s: 58 rss: 74Mb L: 30/30 MS: 1 CopyPart- 00:07:42.008 [2024-12-17 01:23:28.002006] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:42.008 [2024-12-17 01:23:28.002125] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:42.008 [2024-12-17 01:23:28.002347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:28.002374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.008 [2024-12-17 01:23:28.002433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.008 [2024-12-17 01:23:28.002448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.267 #59 NEW cov: 12458 ft: 15842 corp: 35/722b lim: 30 exec/s: 59 rss: 74Mb L: 14/30 MS: 1 CopyPart- 00:07:42.267 [2024-12-17 01:23:28.042179] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300005757 00:07:42.267 [2024-12-17 01:23:28.042301] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300005757 00:07:42.267 [2024-12-17 01:23:28.042411] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e1e1 00:07:42.267 [2024-12-17 01:23:28.042533] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000e14a 00:07:42.267 [2024-12-17 01:23:28.042751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:03e183e1 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.267 [2024-12-17 01:23:28.042778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.267 [2024-12-17 01:23:28.042839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:57ec8357 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.267 [2024-12-17 01:23:28.042854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.267 [2024-12-17 01:23:28.042912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:575781e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.267 [2024-12-17 01:23:28.042926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.267 [2024-12-17 01:23:28.042986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:e1e181e1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.267 [2024-12-17 01:23:28.043001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.267 #60 NEW cov: 12458 ft: 15860 corp: 36/747b lim: 30 exec/s: 30 rss: 74Mb L: 25/30 MS: 1 InsertByte- 00:07:42.267 #60 DONE cov: 12458 ft: 15860 corp: 36/747b lim: 30 exec/s: 30 rss: 74Mb 00:07:42.267 Done 60 runs in 2 second(s) 00:07:42.267 
01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:42.267 01:23:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:42.267 [2024-12-17 01:23:28.235228] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:42.267 [2024-12-17 01:23:28.235294] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid823991 ] 00:07:42.525 [2024-12-17 01:23:28.414637] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.525 [2024-12-17 01:23:28.436976] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.525 [2024-12-17 01:23:28.489818] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:42.525 [2024-12-17 01:23:28.506139] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:42.525 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:42.525 INFO: Seed: 790137080 00:07:42.783 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:42.783 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:42.783 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:42.783 INFO: A corpus is not provided, starting from an empty corpus 00:07:42.783 #2 INITED exec/s: 0 rss: 65Mb 00:07:42.783 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:42.783 This may also happen if the target rejected all inputs we tried so far 00:07:42.783 [2024-12-17 01:23:28.576870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede000a cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.783 [2024-12-17 01:23:28.576908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.783 [2024-12-17 01:23:28.577024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.783 [2024-12-17 01:23:28.577043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.783 [2024-12-17 01:23:28.577164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.783 [2024-12-17 01:23:28.577181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.783 [2024-12-17 01:23:28.577303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.783 [2024-12-17 01:23:28.577322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.041 NEW_FUNC[1/714]: 0x455b38 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:43.041 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:43.041 #3 NEW cov: 12187 ft: 12183 corp: 2/34b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:43.041 [2024-12-17 01:23:28.906820] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.041 [2024-12-17 01:23:28.907155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.041 [2024-12-17 01:23:28.907192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.041 [2024-12-17 01:23:28.907304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.041 [2024-12-17 01:23:28.907324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.041 #7 NEW cov: 12311 ft: 13361 corp: 3/54b lim: 35 exec/s: 0 rss: 72Mb L: 20/33 MS: 4 
InsertByte-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:07:43.041 [2024-12-17 01:23:28.947750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.041 [2024-12-17 01:23:28.947778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.041 [2024-12-17 01:23:28.947928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.041 [2024-12-17 01:23:28.947946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.041 [2024-12-17 01:23:28.948060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.041 [2024-12-17 01:23:28.948078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.041 [2024-12-17 01:23:28.948194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.041 [2024-12-17 01:23:28.948216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.041 #11 NEW cov: 12317 ft: 13700 corp: 4/88b lim: 35 exec/s: 0 rss: 72Mb L: 34/34 MS: 4 ChangeByte-CopyPart-CMP-InsertRepeatedBytes- DE: "\376\377"- 00:07:43.041 [2024-12-17 01:23:28.987800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede000a cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.041 [2024-12-17 01:23:28.987828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.041 [2024-12-17 01:23:28.987956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.041 [2024-12-17 01:23:28.987977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.041 [2024-12-17 01:23:28.988094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.041 [2024-12-17 01:23:28.988112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.041 [2024-12-17 01:23:28.988231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.041 [2024-12-17 01:23:28.988249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.041 #12 NEW cov: 12402 ft: 13950 corp: 5/122b lim: 35 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 InsertByte- 00:07:43.300 [2024-12-17 01:23:29.047216] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.300 [2024-12-17 01:23:29.047577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000018 cdw11:00000000 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.047606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.300 [2024-12-17 01:23:29.047726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:fc000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.047749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.300 #13 NEW cov: 12402 ft: 14050 corp: 6/142b lim: 35 exec/s: 0 rss: 72Mb L: 20/34 MS: 1 ChangeBinInt- 00:07:43.300 [2024-12-17 01:23:29.117337] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.300 [2024-12-17 01:23:29.117510] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.300 [2024-12-17 01:23:29.117657] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.300 [2024-12-17 01:23:29.118006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.118036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.300 [2024-12-17 01:23:29.118151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.118172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.300 [2024-12-17 01:23:29.118289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.118312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.300 #14 NEW cov: 12402 ft: 14360 corp: 7/164b lim: 35 exec/s: 0 rss: 72Mb L: 22/34 MS: 1 InsertRepeatedBytes- 00:07:43.300 [2024-12-17 01:23:29.168325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede000a cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.168352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.300 [2024-12-17 01:23:29.168499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.168517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.300 [2024-12-17 01:23:29.168633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.168650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.300 [2024-12-17 01:23:29.168764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) 
qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.168779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.300 #15 NEW cov: 12402 ft: 14415 corp: 8/197b lim: 35 exec/s: 0 rss: 72Mb L: 33/34 MS: 1 ChangeBinInt- 00:07:43.300 [2024-12-17 01:23:29.217685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:feff000a cdw11:00000300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.217714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.300 #17 NEW cov: 12402 ft: 14757 corp: 9/208b lim: 35 exec/s: 0 rss: 72Mb L: 11/34 MS: 2 PersAutoDict-CMP- DE: "\376\377"-"\003\000\000\000\000\000\000\000"- 00:07:43.300 [2024-12-17 01:23:29.267708] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.300 [2024-12-17 01:23:29.267915] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.300 [2024-12-17 01:23:29.268387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.268421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.300 [2024-12-17 01:23:29.268536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:de000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.268559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.300 [2024-12-17 01:23:29.268674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:a84c00bd cdw11:0000c905 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.300 [2024-12-17 01:23:29.268690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.559 #18 NEW cov: 12402 ft: 14958 corp: 10/230b lim: 35 exec/s: 0 rss: 72Mb L: 22/34 MS: 1 CMP- DE: "\336g\275\250L\311\005\000"- 00:07:43.559 [2024-12-17 01:23:29.337989] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.559 [2024-12-17 01:23:29.338599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede0000 cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.338632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.559 [2024-12-17 01:23:29.338753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.338775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.559 [2024-12-17 01:23:29.338888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:a84c00bd cdw11:0000c905 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.338905] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.559 #19 NEW cov: 12402 ft: 15003 corp: 11/252b lim: 35 exec/s: 0 rss: 72Mb L: 22/34 MS: 1 CrossOver- 00:07:43.559 [2024-12-17 01:23:29.409006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede000a cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.409034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.559 [2024-12-17 01:23:29.409150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3bde00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.409167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.559 [2024-12-17 01:23:29.409290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.409307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.559 [2024-12-17 01:23:29.409431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.409448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.559 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:43.559 #20 NEW cov: 12425 ft: 15033 corp: 12/286b lim: 35 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 ChangeByte- 00:07:43.559 [2024-12-17 01:23:29.479338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:2223000a cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.479364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.559 [2024-12-17 01:23:29.479471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3bde00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.479489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.559 [2024-12-17 01:23:29.479602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.479619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.559 [2024-12-17 01:23:29.479743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.479759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.559 #21 NEW cov: 12425 ft: 15046 corp: 13/320b lim: 35 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 ChangeBinInt- 00:07:43.559 [2024-12-17 01:23:29.539445] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede000a cdw11:de0094de SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.539470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.559 [2024-12-17 01:23:29.539587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3bde00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.539604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.559 [2024-12-17 01:23:29.539721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.539736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.559 [2024-12-17 01:23:29.539849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.559 [2024-12-17 01:23:29.539871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.559 #22 NEW cov: 12425 ft: 15118 corp: 14/354b lim: 35 exec/s: 22 rss: 72Mb L: 34/34 MS: 1 ChangeByte- 00:07:43.824 [2024-12-17 01:23:29.588951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:de6700ff cdw11:4c00bda8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.824 [2024-12-17 01:23:29.588977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.824 #24 NEW cov: 12425 ft: 15218 corp: 15/363b lim: 35 exec/s: 24 rss: 72Mb L: 9/34 MS: 2 CrossOver-PersAutoDict- DE: "\336g\275\250L\311\005\000"- 00:07:43.824 [2024-12-17 01:23:29.628879] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.824 [2024-12-17 01:23:29.629045] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.824 [2024-12-17 01:23:29.629198] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.824 [2024-12-17 01:23:29.629544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.825 [2024-12-17 01:23:29.629578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.825 [2024-12-17 01:23:29.629698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.825 [2024-12-17 01:23:29.629720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.825 [2024-12-17 01:23:29.629823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.825 [2024-12-17 01:23:29.629845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.825 #25 NEW cov: 12425 ft: 15225 corp: 16/385b lim: 35 exec/s: 25 rss: 72Mb L: 22/34 MS: 1 CopyPart- 00:07:43.825 [2024-12-17 01:23:29.669755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede000a cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.825 [2024-12-17 01:23:29.669783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.825 [2024-12-17 01:23:29.669912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.825 [2024-12-17 01:23:29.669932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.825 [2024-12-17 01:23:29.670044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.825 [2024-12-17 01:23:29.670061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.825 [2024-12-17 01:23:29.670188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.825 [2024-12-17 01:23:29.670204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.825 #26 NEW cov: 12425 ft: 15271 corp: 17/419b lim: 35 exec/s: 26 rss: 72Mb L: 34/34 MS: 1 ChangeByte- 00:07:43.826 [2024-12-17 01:23:29.719092] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.826 [2024-12-17 01:23:29.719256] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.826 [2024-12-17 01:23:29.719411] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:43.826 [2024-12-17 01:23:29.719751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-12-17 01:23:29.719782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.826 [2024-12-17 01:23:29.719900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-12-17 01:23:29.719927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.826 [2024-12-17 01:23:29.720050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:92000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-12-17 01:23:29.720074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.826 #27 NEW cov: 12425 ft: 15278 corp: 18/441b lim: 35 exec/s: 27 rss: 72Mb L: 22/34 MS: 1 ChangeByte- 00:07:43.826 [2024-12-17 01:23:29.769628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:de6700ff cdw11:6700bda8 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:43.826 [2024-12-17 01:23:29.769658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.826 [2024-12-17 01:23:29.769771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:4cc900a8 cdw11:4c000500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.826 [2024-12-17 01:23:29.769803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.826 #33 NEW cov: 12425 ft: 15310 corp: 19/457b lim: 35 exec/s: 33 rss: 72Mb L: 16/34 MS: 1 CopyPart- 00:07:44.086 [2024-12-17 01:23:29.840635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:de40000a cdw11:de00de94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.086 [2024-12-17 01:23:29.840662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.086 [2024-12-17 01:23:29.840779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:de3b00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.086 [2024-12-17 01:23:29.840806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.086 [2024-12-17 01:23:29.840931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.086 [2024-12-17 01:23:29.840950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.086 [2024-12-17 01:23:29.841064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.086 [2024-12-17 01:23:29.841082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.086 [2024-12-17 01:23:29.841202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ddde00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.086 [2024-12-17 01:23:29.841219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.086 #34 NEW cov: 12425 ft: 15350 corp: 20/492b lim: 35 exec/s: 34 rss: 72Mb L: 35/35 MS: 1 InsertByte- 00:07:44.086 [2024-12-17 01:23:29.910110] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.086 [2024-12-17 01:23:29.910274] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.086 [2024-12-17 01:23:29.910630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000018 cdw11:70000070 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.086 [2024-12-17 01:23:29.910657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.086 [2024-12-17 01:23:29.910775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:70700070 cdw11:70007070 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.086 [2024-12-17 01:23:29.910796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.086 [2024-12-17 01:23:29.910910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.086 [2024-12-17 01:23:29.910934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.086 [2024-12-17 01:23:29.911050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:fc000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.086 [2024-12-17 01:23:29.911069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.086 #35 NEW cov: 12425 ft: 15374 corp: 21/522b lim: 35 exec/s: 35 rss: 72Mb L: 30/35 MS: 1 InsertRepeatedBytes- 00:07:44.086 [2024-12-17 01:23:29.980803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede000a cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.086 [2024-12-17 01:23:29.980840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.086 [2024-12-17 01:23:29.980958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.086 [2024-12-17 01:23:29.980975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.086 [2024-12-17 01:23:29.981093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.087 [2024-12-17 01:23:29.981110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.087 [2024-12-17 01:23:29.981223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.087 [2024-12-17 01:23:29.981240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.087 #36 NEW cov: 12425 ft: 15402 corp: 22/556b lim: 35 exec/s: 36 rss: 72Mb L: 34/35 MS: 1 InsertByte- 00:07:44.087 [2024-12-17 01:23:30.030900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:de22000a cdw11:20002121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.087 [2024-12-17 01:23:30.030927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.087 [2024-12-17 01:23:30.031040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.087 [2024-12-17 01:23:30.031070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.087 [2024-12-17 01:23:30.031188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.087 [2024-12-17 01:23:30.031205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.087 [2024-12-17 01:23:30.031317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.087 [2024-12-17 01:23:30.031334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.087 #37 NEW cov: 12425 ft: 15406 corp: 23/590b lim: 35 exec/s: 37 rss: 72Mb L: 34/35 MS: 1 ChangeBinInt- 00:07:44.345 [2024-12-17 01:23:30.101156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.101187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.101305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.101324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.101446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.101464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.101585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.101604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.345 #38 NEW cov: 12425 ft: 15426 corp: 24/624b lim: 35 exec/s: 38 rss: 73Mb L: 34/35 MS: 1 CopyPart- 00:07:44.345 [2024-12-17 01:23:30.171652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:de40000a cdw11:de00de94 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.171678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.171784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:de3b00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.171806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.171922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.171938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.172058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.172073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.172191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ddde00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.172206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.345 #39 NEW cov: 12425 ft: 15449 corp: 25/659b lim: 35 exec/s: 39 rss: 73Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:44.345 [2024-12-17 01:23:30.241921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:de22000a cdw11:21002921 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.241950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.242091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.242108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.242235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.242252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.242376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.242392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.242526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.242545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.345 #40 NEW cov: 12425 ft: 15462 corp: 26/694b lim: 35 exec/s: 40 rss: 73Mb L: 35/35 MS: 1 InsertByte- 00:07:44.345 [2024-12-17 01:23:30.311818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede000a cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.311844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.311972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.312000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.345 [2024-12-17 01:23:30.312119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.312137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:44.345 [2024-12-17 01:23:30.312261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.345 [2024-12-17 01:23:30.312279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.345 #41 NEW cov: 12425 ft: 15464 corp: 27/727b lim: 35 exec/s: 41 rss: 73Mb L: 33/35 MS: 1 ShuffleBytes- 00:07:44.604 [2024-12-17 01:23:30.361984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dede000a cdw11:de0094de SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.362010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.604 [2024-12-17 01:23:30.362147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3bde00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.362164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.604 [2024-12-17 01:23:30.362283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.362301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.604 [2024-12-17 01:23:30.362424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:dede00de cdw11:de00dede SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.362442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.604 #42 NEW cov: 12425 ft: 15478 corp: 28/761b lim: 35 exec/s: 42 rss: 73Mb L: 34/35 MS: 1 ShuffleBytes- 00:07:44.604 [2024-12-17 01:23:30.411251] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.604 [2024-12-17 01:23:30.411426] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.604 [2024-12-17 01:23:30.411578] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.604 [2024-12-17 01:23:30.411943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.411978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.604 [2024-12-17 01:23:30.412098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.412117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.604 [2024-12-17 01:23:30.412237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ff0000fe SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.412262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.604 #43 NEW cov: 12425 ft: 15506 corp: 29/783b lim: 35 exec/s: 43 rss: 73Mb L: 22/35 MS: 1 PersAutoDict- DE: "\376\377"- 00:07:44.604 [2024-12-17 01:23:30.481846] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.604 [2024-12-17 01:23:30.482008] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.604 [2024-12-17 01:23:30.482345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000018 cdw11:70000070 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.482373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.604 [2024-12-17 01:23:30.482496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:70700070 cdw11:70007070 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.482514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.604 [2024-12-17 01:23:30.482627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.482650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.604 [2024-12-17 01:23:30.482772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000fc00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.482797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.604 #44 NEW cov: 12425 ft: 15518 corp: 30/815b lim: 35 exec/s: 44 rss: 73Mb L: 32/35 MS: 1 CrossOver- 00:07:44.604 [2024-12-17 01:23:30.551782] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.604 [2024-12-17 01:23:30.552111] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.604 [2024-12-17 01:23:30.552270] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:44.604 [2024-12-17 01:23:30.552660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:70001e70 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.552693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.604 [2024-12-17 01:23:30.552810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:70700070 cdw11:70007070 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.552827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.604 [2024-12-17 01:23:30.552943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.552965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.604 
[2024-12-17 01:23:30.553083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:fc000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.604 [2024-12-17 01:23:30.553107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.604 #45 NEW cov: 12425 ft: 15532 corp: 31/845b lim: 35 exec/s: 22 rss: 73Mb L: 30/35 MS: 1 ChangeBinInt- 00:07:44.604 #45 DONE cov: 12425 ft: 15532 corp: 31/845b lim: 35 exec/s: 22 rss: 73Mb 00:07:44.604 ###### Recommended dictionary. ###### 00:07:44.604 "\376\377" # Uses: 2 00:07:44.604 "\003\000\000\000\000\000\000\000" # Uses: 0 00:07:44.604 "\336g\275\250L\311\005\000" # Uses: 1 00:07:44.604 ###### End of recommended dictionary. ###### 00:07:44.604 Done 45 runs in 2 second(s) 00:07:44.863 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:07:44.863 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:44.863 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:44.863 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:44.863 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:44.863 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:44.863 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:44.863 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:44.863 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:44.863 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:44.864 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:44.864 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:07:44.864 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:07:44.864 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:44.864 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:44.864 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:44.864 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:44.864 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:44.864 01:23:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:07:44.864 [2024-12-17 01:23:30.738475] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:44.864 [2024-12-17 01:23:30.738538] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid824479 ] 00:07:45.122 [2024-12-17 01:23:30.985796] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.122 [2024-12-17 01:23:31.016341] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.122 [2024-12-17 01:23:31.068624] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:45.122 [2024-12-17 01:23:31.084953] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:45.122 INFO: Running with entropic power schedule (0xFF, 100). 00:07:45.122 INFO: Seed: 3370148212 00:07:45.122 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:45.122 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:45.122 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:45.122 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.122 #2 INITED exec/s: 0 rss: 64Mb 00:07:45.122 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:45.122 This may also happen if the target rejected all inputs we tried so far 00:07:45.380 [2024-12-17 01:23:31.129880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.380 [2024-12-17 01:23:31.129925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.638 NEW_FUNC[1/722]: 0x457818 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:45.638 NEW_FUNC[2/722]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:45.638 #5 NEW cov: 12391 ft: 12389 corp: 2/10b lim: 20 exec/s: 0 rss: 72Mb L: 9/9 MS: 3 ChangeBit-ShuffleBytes-CMP- DE: "\000\000\000\000\000\000\000p"- 00:07:45.638 [2024-12-17 01:23:31.490818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.639 [2024-12-17 01:23:31.490859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.639 #7 NEW cov: 12504 ft: 12983 corp: 3/19b lim: 20 exec/s: 0 rss: 72Mb L: 9/9 MS: 2 ChangeByte-PersAutoDict- DE: "\000\000\000\000\000\000\000p"- 00:07:45.639 [2024-12-17 01:23:31.550825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.639 [2024-12-17 01:23:31.550861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.639 NEW_FUNC[1/1]: 0x156d788 in _nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3649 00:07:45.639 #23 NEW cov: 12537 ft: 13329 corp: 4/28b lim: 20 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000p"- 00:07:45.897 [2024-12-17 01:23:31.651192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) 
qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.897 [2024-12-17 01:23:31.651228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.897 #24 NEW cov: 12645 ft: 14023 corp: 5/46b lim: 20 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 CopyPart- 00:07:45.897 [2024-12-17 01:23:31.751253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.897 [2024-12-17 01:23:31.751295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.897 #25 NEW cov: 12645 ft: 14483 corp: 6/52b lim: 20 exec/s: 0 rss: 72Mb L: 6/18 MS: 1 EraseBytes- 00:07:45.897 [2024-12-17 01:23:31.841576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.897 [2024-12-17 01:23:31.841608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.155 #26 NEW cov: 12645 ft: 14657 corp: 7/70b lim: 20 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 ChangeBinInt- 00:07:46.155 [2024-12-17 01:23:31.931948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.155 [2024-12-17 01:23:31.931979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.155 #27 NEW cov: 12645 ft: 14905 corp: 8/90b lim: 20 exec/s: 0 rss: 72Mb L: 20/20 MS: 1 CopyPart- 00:07:46.155 [2024-12-17 01:23:32.021955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.155 [2024-12-17 01:23:32.021985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.155 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:46.155 #28 NEW cov: 12662 ft: 14963 corp: 9/99b lim: 20 exec/s: 0 rss: 72Mb L: 9/20 MS: 1 ShuffleBytes- 00:07:46.155 [2024-12-17 01:23:32.082221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.155 [2024-12-17 01:23:32.082253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.155 #29 NEW cov: 12662 ft: 15050 corp: 10/118b lim: 20 exec/s: 29 rss: 72Mb L: 19/20 MS: 1 InsertByte- 00:07:46.155 [2024-12-17 01:23:32.142389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.155 [2024-12-17 01:23:32.142421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.414 #30 NEW cov: 12662 ft: 15100 corp: 11/136b lim: 20 exec/s: 30 rss: 72Mb L: 18/20 MS: 1 ChangeByte- 00:07:46.414 [2024-12-17 01:23:32.192491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.414 [2024-12-17 01:23:32.192521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.414 #31 
NEW cov: 12662 ft: 15144 corp: 12/152b lim: 20 exec/s: 31 rss: 72Mb L: 16/20 MS: 1 CopyPart- 00:07:46.414 #32 NEW cov: 12662 ft: 15192 corp: 13/161b lim: 20 exec/s: 32 rss: 72Mb L: 9/20 MS: 1 ChangeBinInt- 00:07:46.414 [2024-12-17 01:23:32.312817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.414 [2024-12-17 01:23:32.312848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.414 #33 NEW cov: 12662 ft: 15268 corp: 14/179b lim: 20 exec/s: 33 rss: 72Mb L: 18/20 MS: 1 ChangeByte- 00:07:46.414 [2024-12-17 01:23:32.362833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.414 [2024-12-17 01:23:32.362863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.414 #34 NEW cov: 12662 ft: 15334 corp: 15/188b lim: 20 exec/s: 34 rss: 72Mb L: 9/20 MS: 1 ChangeByte- 00:07:46.415 [2024-12-17 01:23:32.413163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.415 [2024-12-17 01:23:32.413198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.673 #35 NEW cov: 12662 ft: 15348 corp: 16/208b lim: 20 exec/s: 35 rss: 73Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:46.673 [2024-12-17 01:23:32.503322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.673 [2024-12-17 01:23:32.503354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.673 #36 NEW cov: 12662 ft: 15384 corp: 17/226b lim: 20 exec/s: 36 rss: 73Mb L: 18/20 MS: 1 EraseBytes- 00:07:46.673 [2024-12-17 01:23:32.593495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.673 [2024-12-17 01:23:32.593525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.673 #37 NEW cov: 12662 ft: 15396 corp: 18/235b lim: 20 exec/s: 37 rss: 73Mb L: 9/20 MS: 1 ShuffleBytes- 00:07:46.931 [2024-12-17 01:23:32.683846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.931 [2024-12-17 01:23:32.683877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.931 #38 NEW cov: 12662 ft: 15443 corp: 19/252b lim: 20 exec/s: 38 rss: 73Mb L: 17/20 MS: 1 EraseBytes- 00:07:46.931 [2024-12-17 01:23:32.743944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.931 [2024-12-17 01:23:32.743975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.931 #39 NEW cov: 12662 ft: 15521 corp: 20/270b lim: 20 exec/s: 39 rss: 73Mb L: 18/20 MS: 1 CopyPart- 00:07:46.931 [2024-12-17 01:23:32.804051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 
cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.931 [2024-12-17 01:23:32.804085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.931 #40 NEW cov: 12662 ft: 15535 corp: 21/279b lim: 20 exec/s: 40 rss: 73Mb L: 9/20 MS: 1 ChangeBinInt- 00:07:46.931 [2024-12-17 01:23:32.854309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.931 [2024-12-17 01:23:32.854341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.931 #41 NEW cov: 12662 ft: 15562 corp: 22/296b lim: 20 exec/s: 41 rss: 73Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:47.189 [2024-12-17 01:23:32.954612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.189 [2024-12-17 01:23:32.954646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.189 #42 NEW cov: 12669 ft: 15577 corp: 23/313b lim: 20 exec/s: 42 rss: 73Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:47.189 #43 NEW cov: 12670 ft: 15596 corp: 24/322b lim: 20 exec/s: 21 rss: 73Mb L: 9/20 MS: 1 ChangeBinInt- 00:07:47.189 #43 DONE cov: 12670 ft: 15596 corp: 24/322b lim: 20 exec/s: 21 rss: 73Mb 00:07:47.189 ###### Recommended dictionary. ###### 00:07:47.189 "\000\000\000\000\000\000\000p" # Uses: 2 00:07:47.189 ###### End of recommended dictionary. ###### 00:07:47.189 Done 43 runs in 2 second(s) 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:47.448 
01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:47.448 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:47.449 01:23:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:07:47.449 [2024-12-17 01:23:33.289273] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:47.449 [2024-12-17 01:23:33.289342] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid825009 ] 00:07:47.708 [2024-12-17 01:23:33.540297] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.708 [2024-12-17 01:23:33.569824] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.708 [2024-12-17 01:23:33.622118] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.708 [2024-12-17 01:23:33.638443] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:47.708 INFO: Running with entropic power schedule (0xFF, 100). 00:07:47.708 INFO: Seed: 1628185912 00:07:47.708 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:47.708 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:47.708 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:47.708 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.708 #2 INITED exec/s: 0 rss: 64Mb 00:07:47.708 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:47.708 This may also happen if the target rejected all inputs we tried so far 00:07:47.708 [2024-12-17 01:23:33.684149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.708 [2024-12-17 01:23:33.684177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.708 [2024-12-17 01:23:33.684234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.708 [2024-12-17 01:23:33.684248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.708 [2024-12-17 01:23:33.684300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.708 [2024-12-17 01:23:33.684316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.225 NEW_FUNC[1/715]: 0x458918 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:48.225 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:48.225 #6 NEW cov: 12208 ft: 12202 corp: 2/22b lim: 35 exec/s: 0 rss: 72Mb L: 21/21 MS: 4 ShuffleBytes-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:48.225 [2024-12-17 01:23:33.995156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:33.995189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.225 [2024-12-17 01:23:33.995244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:33.995258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.225 [2024-12-17 01:23:33.995313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:33.995327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.225 [2024-12-17 01:23:33.995383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:33.995396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.225 #11 NEW cov: 12321 ft: 13130 corp: 3/55b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 5 ChangeByte-ShuffleBytes-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:07:48.225 [2024-12-17 01:23:34.034880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:48.225 [2024-12-17 01:23:34.034907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.225 [2024-12-17 01:23:34.034965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:34.034978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.225 #15 NEW cov: 12327 ft: 13572 corp: 4/74b lim: 35 exec/s: 0 rss: 72Mb L: 19/33 MS: 4 InsertByte-ChangeByte-EraseBytes-InsertRepeatedBytes- 00:07:48.225 [2024-12-17 01:23:34.075322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:34.075352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.225 [2024-12-17 01:23:34.075409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:34.075423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.225 [2024-12-17 01:23:34.075480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:34.075495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.225 [2024-12-17 01:23:34.075551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:34.075567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.225 #16 NEW cov: 12412 ft: 13939 corp: 5/107b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ChangeByte- 00:07:48.225 [2024-12-17 01:23:34.135335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:34.135362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.225 [2024-12-17 01:23:34.135416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:34.135431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.225 [2024-12-17 01:23:34.135485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:34.135499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.225 #19 NEW cov: 12412 ft: 14130 corp: 6/131b lim: 35 exec/s: 0 rss: 72Mb L: 24/33 MS: 3 ChangeBinInt-InsertByte-InsertRepeatedBytes- 00:07:48.225 [2024-12-17 
01:23:34.175598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bfffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:34.175626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.225 [2024-12-17 01:23:34.175683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.225 [2024-12-17 01:23:34.175697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.226 [2024-12-17 01:23:34.175754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.226 [2024-12-17 01:23:34.175768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.226 [2024-12-17 01:23:34.175830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.226 [2024-12-17 01:23:34.175844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.226 #20 NEW cov: 12412 ft: 14232 corp: 7/164b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ChangeBit- 00:07:48.484 [2024-12-17 01:23:34.235782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.235813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.484 [2024-12-17 01:23:34.235888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.235902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.484 [2024-12-17 01:23:34.235959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.235973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.484 [2024-12-17 01:23:34.236028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.236044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.484 #21 NEW cov: 12412 ft: 14289 corp: 8/198b lim: 35 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 CopyPart- 00:07:48.484 [2024-12-17 01:23:34.275549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.275576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.484 [2024-12-17 
01:23:34.275634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.275649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.484 #22 NEW cov: 12412 ft: 14340 corp: 9/217b lim: 35 exec/s: 0 rss: 72Mb L: 19/34 MS: 1 ShuffleBytes- 00:07:48.484 [2024-12-17 01:23:34.335702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.335730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.484 [2024-12-17 01:23:34.335795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.335810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.484 #23 NEW cov: 12412 ft: 14396 corp: 10/237b lim: 35 exec/s: 0 rss: 72Mb L: 20/34 MS: 1 EraseBytes- 00:07:48.484 [2024-12-17 01:23:34.376163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.376190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.484 [2024-12-17 01:23:34.376250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.376263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.484 [2024-12-17 01:23:34.376319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.376333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.484 [2024-12-17 01:23:34.376389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.376402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.484 #24 NEW cov: 12412 ft: 14483 corp: 11/271b lim: 35 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 ShuffleBytes- 00:07:48.484 [2024-12-17 01:23:34.435982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff040000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.436008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.484 [2024-12-17 01:23:34.436065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.484 [2024-12-17 01:23:34.436080] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.484 #25 NEW cov: 12412 ft: 14522 corp: 12/291b lim: 35 exec/s: 0 rss: 72Mb L: 20/34 MS: 1 ChangeBinInt- 00:07:48.743 [2024-12-17 01:23:34.496499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bfc1ffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.743 [2024-12-17 01:23:34.496525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.743 [2024-12-17 01:23:34.496600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.743 [2024-12-17 01:23:34.496614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.743 [2024-12-17 01:23:34.496673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.743 [2024-12-17 01:23:34.496687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.743 [2024-12-17 01:23:34.496744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.743 [2024-12-17 01:23:34.496757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.743 #26 NEW cov: 12412 ft: 14562 corp: 13/324b lim: 35 exec/s: 0 rss: 72Mb L: 33/34 MS: 1 ChangeByte- 00:07:48.743 [2024-12-17 01:23:34.556614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.743 [2024-12-17 01:23:34.556641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.743 [2024-12-17 01:23:34.556699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.743 [2024-12-17 01:23:34.556713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.743 [2024-12-17 01:23:34.556771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.743 [2024-12-17 01:23:34.556785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.743 [2024-12-17 01:23:34.556860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.743 [2024-12-17 01:23:34.556874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.743 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:48.743 #27 NEW cov: 12435 ft: 14580 corp: 14/357b lim: 35 exec/s: 0 rss: 73Mb L: 33/34 MS: 1 ShuffleBytes- 00:07:48.743 
[2024-12-17 01:23:34.596391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.743 [2024-12-17 01:23:34.596418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.743 [2024-12-17 01:23:34.596477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5ad20002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.744 [2024-12-17 01:23:34.596491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.744 #28 NEW cov: 12435 ft: 14614 corp: 15/376b lim: 35 exec/s: 0 rss: 73Mb L: 19/34 MS: 1 ChangeByte- 00:07:48.744 [2024-12-17 01:23:34.636545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.744 [2024-12-17 01:23:34.636574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.744 [2024-12-17 01:23:34.636632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:0d5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.744 [2024-12-17 01:23:34.636646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.744 #29 NEW cov: 12435 ft: 14623 corp: 16/395b lim: 35 exec/s: 0 rss: 73Mb L: 19/34 MS: 1 ChangeByte- 00:07:48.744 [2024-12-17 01:23:34.676989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bfffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.744 [2024-12-17 01:23:34.677015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.744 [2024-12-17 01:23:34.677071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff00ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.744 [2024-12-17 01:23:34.677084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.744 [2024-12-17 01:23:34.677140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.744 [2024-12-17 01:23:34.677154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.744 [2024-12-17 01:23:34.677209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffb3ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.744 [2024-12-17 01:23:34.677222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.744 #30 NEW cov: 12435 ft: 14646 corp: 17/428b lim: 35 exec/s: 30 rss: 73Mb L: 33/34 MS: 1 CrossOver- 00:07:48.744 [2024-12-17 01:23:34.716593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.744 [2024-12-17 01:23:34.716620] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.002 #31 NEW cov: 12435 ft: 15429 corp: 18/441b lim: 35 exec/s: 31 rss: 73Mb L: 13/34 MS: 1 CrossOver- 00:07:49.002 [2024-12-17 01:23:34.777429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bfffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.002 [2024-12-17 01:23:34.777455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.777514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.777528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.777587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.777601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.777655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.777668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.777724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffff23 cdw11:ff2f0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.777740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.003 #32 NEW cov: 12435 ft: 15501 corp: 19/476b lim: 35 exec/s: 32 rss: 73Mb L: 35/35 MS: 1 CopyPart- 00:07:49.003 [2024-12-17 01:23:34.817367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffdf cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.817393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.817454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.817467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.817526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.817540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.817614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.817628] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.003 #33 NEW cov: 12435 ft: 15544 corp: 20/510b lim: 35 exec/s: 33 rss: 73Mb L: 34/35 MS: 1 ChangeBit- 00:07:49.003 [2024-12-17 01:23:34.857145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.857171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.857229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.857244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.003 #34 NEW cov: 12435 ft: 15558 corp: 21/527b lim: 35 exec/s: 34 rss: 73Mb L: 17/35 MS: 1 EraseBytes- 00:07:49.003 [2024-12-17 01:23:34.917333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.917358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.917414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5ad20002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.917428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.003 #35 NEW cov: 12435 ft: 15559 corp: 22/546b lim: 35 exec/s: 35 rss: 73Mb L: 19/35 MS: 1 CopyPart- 00:07:49.003 [2024-12-17 01:23:34.957949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.957974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.958033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:0d5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.958047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.958102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:63636363 cdw11:63630002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.958119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.958175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:63636363 cdw11:63630002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.958189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.958246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:5a5a6363 
cdw11:5a5a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.958259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.003 #36 NEW cov: 12435 ft: 15599 corp: 23/581b lim: 35 exec/s: 36 rss: 73Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:49.003 [2024-12-17 01:23:34.997535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:d55a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.997560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.003 [2024-12-17 01:23:34.997615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.003 [2024-12-17 01:23:34.997628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.270 #37 NEW cov: 12435 ft: 15632 corp: 24/599b lim: 35 exec/s: 37 rss: 73Mb L: 18/35 MS: 1 InsertByte- 00:07:49.270 [2024-12-17 01:23:35.057749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.270 [2024-12-17 01:23:35.057774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.270 [2024-12-17 01:23:35.057832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.270 [2024-12-17 01:23:35.057846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.270 #38 NEW cov: 12435 ft: 15653 corp: 25/618b lim: 35 exec/s: 38 rss: 73Mb L: 19/35 MS: 1 ShuffleBytes- 00:07:49.270 [2024-12-17 01:23:35.118221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffdf cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.270 [2024-12-17 01:23:35.118247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.270 [2024-12-17 01:23:35.118302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.270 [2024-12-17 01:23:35.118315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.270 [2024-12-17 01:23:35.118371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.270 [2024-12-17 01:23:35.118400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.270 [2024-12-17 01:23:35.118456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.270 [2024-12-17 01:23:35.118469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.270 #39 NEW cov: 12435 
ft: 15706 corp: 26/652b lim: 35 exec/s: 39 rss: 73Mb L: 34/35 MS: 1 CopyPart- 00:07:49.270 [2024-12-17 01:23:35.178093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bfffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.270 [2024-12-17 01:23:35.178119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.270 [2024-12-17 01:23:35.178172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.270 [2024-12-17 01:23:35.178185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.270 #40 NEW cov: 12435 ft: 15734 corp: 27/670b lim: 35 exec/s: 40 rss: 73Mb L: 18/35 MS: 1 EraseBytes- 00:07:49.270 [2024-12-17 01:23:35.218498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.271 [2024-12-17 01:23:35.218522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.271 [2024-12-17 01:23:35.218579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.271 [2024-12-17 01:23:35.218593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.271 [2024-12-17 01:23:35.218648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.271 [2024-12-17 01:23:35.218661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.271 [2024-12-17 01:23:35.218717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffff7dff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.271 [2024-12-17 01:23:35.218730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.271 #41 NEW cov: 12435 ft: 15757 corp: 28/704b lim: 35 exec/s: 41 rss: 73Mb L: 34/35 MS: 1 InsertByte- 00:07:49.271 [2024-12-17 01:23:35.258145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff01ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.271 [2024-12-17 01:23:35.258171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.531 #45 NEW cov: 12435 ft: 15778 corp: 29/712b lim: 35 exec/s: 45 rss: 73Mb L: 8/35 MS: 4 ChangeByte-ChangeBinInt-ShuffleBytes-CrossOver- 00:07:49.531 [2024-12-17 01:23:35.298231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.531 [2024-12-17 01:23:35.298257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.531 #46 NEW cov: 12435 ft: 15835 corp: 30/724b lim: 35 exec/s: 46 rss: 73Mb L: 12/35 MS: 1 EraseBytes- 00:07:49.531 [2024-12-17 
01:23:35.338726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.531 [2024-12-17 01:23:35.338754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.531 [2024-12-17 01:23:35.338806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.531 [2024-12-17 01:23:35.338820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.531 [2024-12-17 01:23:35.338884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.531 [2024-12-17 01:23:35.338901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.531 #47 NEW cov: 12435 ft: 15840 corp: 31/749b lim: 35 exec/s: 47 rss: 73Mb L: 25/35 MS: 1 InsertRepeatedBytes- 00:07:49.531 [2024-12-17 01:23:35.378491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff5bff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.531 [2024-12-17 01:23:35.378517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.531 #48 NEW cov: 12435 ft: 15873 corp: 32/757b lim: 35 exec/s: 48 rss: 73Mb L: 8/35 MS: 1 ChangeByte- 00:07:49.531 [2024-12-17 01:23:35.438803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5afe5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.531 [2024-12-17 01:23:35.438845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.531 [2024-12-17 01:23:35.438900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.531 [2024-12-17 01:23:35.438914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.531 #49 NEW cov: 12435 ft: 15879 corp: 33/777b lim: 35 exec/s: 49 rss: 73Mb L: 20/35 MS: 1 InsertByte- 00:07:49.531 [2024-12-17 01:23:35.478743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.531 [2024-12-17 01:23:35.478768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.531 #50 NEW cov: 12435 ft: 15946 corp: 34/790b lim: 35 exec/s: 50 rss: 73Mb L: 13/35 MS: 1 EraseBytes- 00:07:49.789 [2024-12-17 01:23:35.539478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.789 [2024-12-17 01:23:35.539504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.789 [2024-12-17 01:23:35.539558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff 
cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.789 [2024-12-17 01:23:35.539572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.789 [2024-12-17 01:23:35.539627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.789 [2024-12-17 01:23:35.539641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.789 [2024-12-17 01:23:35.539694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000ff01 cdw11:01ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.789 [2024-12-17 01:23:35.539707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.789 #51 NEW cov: 12435 ft: 15955 corp: 35/823b lim: 35 exec/s: 51 rss: 73Mb L: 33/35 MS: 1 ChangeBinInt- 00:07:49.789 [2024-12-17 01:23:35.579518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:dfff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.789 [2024-12-17 01:23:35.579545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.789 [2024-12-17 01:23:35.579604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.789 [2024-12-17 01:23:35.579618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.790 [2024-12-17 01:23:35.579678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.790 [2024-12-17 01:23:35.579692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.790 [2024-12-17 01:23:35.579750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.790 [2024-12-17 01:23:35.579762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.790 #52 NEW cov: 12435 ft: 15989 corp: 36/857b lim: 35 exec/s: 52 rss: 73Mb L: 34/35 MS: 1 ChangeBit- 00:07:49.790 [2024-12-17 01:23:35.619127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0701ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.790 [2024-12-17 01:23:35.619152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.790 #53 NEW cov: 12435 ft: 15996 corp: 37/865b lim: 35 exec/s: 53 rss: 73Mb L: 8/35 MS: 1 ChangeBinInt- 00:07:49.790 [2024-12-17 01:23:35.659378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.790 [2024-12-17 01:23:35.659402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.790 [2024-12-17 
01:23:35.659458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:a0a50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.790 [2024-12-17 01:23:35.659472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.790 #54 NEW cov: 12435 ft: 16035 corp: 38/884b lim: 35 exec/s: 27 rss: 73Mb L: 19/35 MS: 1 ChangeBinInt- 00:07:49.790 #54 DONE cov: 12435 ft: 16035 corp: 38/884b lim: 35 exec/s: 27 rss: 73Mb 00:07:49.790 Done 54 runs in 2 second(s) 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:50.048 01:23:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:07:50.048 [2024-12-17 01:23:35.864135] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:50.048 [2024-12-17 01:23:35.864205] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid825305 ] 00:07:50.306 [2024-12-17 01:23:36.121442] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.306 [2024-12-17 01:23:36.151230] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.306 [2024-12-17 01:23:36.203641] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.306 [2024-12-17 01:23:36.219986] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:50.306 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.306 INFO: Seed: 4210190626 00:07:50.306 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:50.306 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:50.306 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:50.306 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.306 #2 INITED exec/s: 0 rss: 64Mb 00:07:50.306 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:50.306 This may also happen if the target rejected all inputs we tried so far 00:07:50.306 [2024-12-17 01:23:36.266006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.306 [2024-12-17 01:23:36.266035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.306 [2024-12-17 01:23:36.266092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.306 [2024-12-17 01:23:36.266106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.306 [2024-12-17 01:23:36.266159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.306 [2024-12-17 01:23:36.266172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.306 [2024-12-17 01:23:36.266226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.306 [2024-12-17 01:23:36.266239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.306 [2024-12-17 01:23:36.266294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.306 [2024-12-17 01:23:36.266307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.564 NEW_FUNC[1/714]: 0x45aab8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:50.564 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.564 #4 NEW cov: 12197 ft: 12217 corp: 2/46b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:50.823 [2024-12-17 01:23:36.586869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:867a8686 cdw11:7c860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.823 [2024-12-17 01:23:36.586905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.823 [2024-12-17 01:23:36.586962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.823 [2024-12-17 01:23:36.586976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.823 [2024-12-17 01:23:36.587031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.823 [2024-12-17 01:23:36.587044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.823 [2024-12-17 01:23:36.587098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.823 [2024-12-17 01:23:36.587111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.823 [2024-12-17 01:23:36.587166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.823 [2024-12-17 01:23:36.587179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.823 NEW_FUNC[1/1]: 0x1c10198 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:589 00:07:50.823 #5 NEW cov: 12332 ft: 12801 corp: 3/91b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:50.823 [2024-12-17 01:23:36.646968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.823 [2024-12-17 01:23:36.646998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.823 [2024-12-17 01:23:36.647052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.823 [2024-12-17 01:23:36.647066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.823 [2024-12-17 01:23:36.647122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.823 [2024-12-17 01:23:36.647136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.824 [2024-12-17 01:23:36.647189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ 
(01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.647201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.824 [2024-12-17 01:23:36.647255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.647268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.824 #6 NEW cov: 12338 ft: 13111 corp: 4/136b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 CopyPart- 00:07:50.824 [2024-12-17 01:23:36.687096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.687123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.824 [2024-12-17 01:23:36.687179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.687195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.824 [2024-12-17 01:23:36.687249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.687262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.824 [2024-12-17 01:23:36.687315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.687328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.824 [2024-12-17 01:23:36.687381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.687395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.824 #7 NEW cov: 12423 ft: 13365 corp: 5/181b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ChangeBit- 00:07:50.824 [2024-12-17 01:23:36.747267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.747294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.824 [2024-12-17 01:23:36.747349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.747364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.824 [2024-12-17 01:23:36.747418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) 
qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.747431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.824 [2024-12-17 01:23:36.747485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.747498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.824 [2024-12-17 01:23:36.747550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86864a86 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.747563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.824 #8 NEW cov: 12423 ft: 13430 corp: 6/226b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ChangeByte- 00:07:50.824 [2024-12-17 01:23:36.806732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.824 [2024-12-17 01:23:36.806758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.083 #13 NEW cov: 12423 ft: 14412 corp: 7/238b lim: 45 exec/s: 0 rss: 72Mb L: 12/45 MS: 5 ChangeByte-ShuffleBytes-CrossOver-CopyPart-CrossOver- 00:07:51.083 [2024-12-17 01:23:36.847474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.847500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.847554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.847571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.847624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00002d00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.847653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.847710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.847723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.847777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.847795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.083 #14 NEW cov: 12423 ft: 14517 corp: 8/283b lim: 45 exec/s: 0 rss: 72Mb 
L: 45/45 MS: 1 ChangeBinInt- 00:07:51.083 [2024-12-17 01:23:36.887602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.887628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.887684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.887698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.887752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.887766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.887821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.887835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.887889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.887903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.083 #15 NEW cov: 12423 ft: 14545 corp: 9/328b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ShuffleBytes- 00:07:51.083 [2024-12-17 01:23:36.927704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.927730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.927788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.927806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.927860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86210004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.927876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.927931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.927944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.927999] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.928012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.083 #16 NEW cov: 12423 ft: 14567 corp: 10/373b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ChangeByte- 00:07:51.083 [2024-12-17 01:23:36.967640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.967666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.967724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.967738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.967796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.967810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:36.967864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:36.967877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.083 #17 NEW cov: 12423 ft: 14602 corp: 11/413b lim: 45 exec/s: 0 rss: 72Mb L: 40/45 MS: 1 EraseBytes- 00:07:51.083 [2024-12-17 01:23:37.007982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:37.008008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:37.008065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:37.008080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:37.008135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:37.008149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:37.008204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:81860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:37.008218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:37.008273] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:37.008286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.083 #18 NEW cov: 12423 ft: 14620 corp: 12/458b lim: 45 exec/s: 0 rss: 73Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:51.083 [2024-12-17 01:23:37.068164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:37.068189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:37.068247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:37.068260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:37.068314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:37.068328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:37.068383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:37.068395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.083 [2024-12-17 01:23:37.068448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86864a86 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.083 [2024-12-17 01:23:37.068461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.342 #19 NEW cov: 12423 ft: 14658 corp: 13/503b lim: 45 exec/s: 0 rss: 73Mb L: 45/45 MS: 1 ChangeBit- 00:07:51.342 [2024-12-17 01:23:37.128022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.342 [2024-12-17 01:23:37.128047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.342 [2024-12-17 01:23:37.128105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.342 [2024-12-17 01:23:37.128119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.342 [2024-12-17 01:23:37.128174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:4a868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.342 [2024-12-17 01:23:37.128204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.342 #20 NEW cov: 12423 ft: 14908 corp: 14/532b 
lim: 45 exec/s: 0 rss: 73Mb L: 29/45 MS: 1 EraseBytes- 00:07:51.342 [2024-12-17 01:23:37.168419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.342 [2024-12-17 01:23:37.168445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.342 [2024-12-17 01:23:37.168501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.342 [2024-12-17 01:23:37.168515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.342 [2024-12-17 01:23:37.168572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00002d00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.342 [2024-12-17 01:23:37.168586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.342 [2024-12-17 01:23:37.168643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.342 [2024-12-17 01:23:37.168656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.342 [2024-12-17 01:23:37.168710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.342 [2024-12-17 01:23:37.168724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.342 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:51.342 #21 NEW cov: 12446 ft: 14940 corp: 15/577b lim: 45 exec/s: 0 rss: 73Mb L: 45/45 MS: 1 ShuffleBytes- 00:07:51.342 [2024-12-17 01:23:37.228596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.342 [2024-12-17 01:23:37.228622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.342 [2024-12-17 01:23:37.228677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.342 [2024-12-17 01:23:37.228691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.343 [2024-12-17 01:23:37.228747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.343 [2024-12-17 01:23:37.228761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.343 [2024-12-17 01:23:37.228817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86828686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.343 [2024-12-17 01:23:37.228831] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.343 [2024-12-17 01:23:37.228884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86864a86 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.343 [2024-12-17 01:23:37.228897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.343 #22 NEW cov: 12446 ft: 14950 corp: 16/622b lim: 45 exec/s: 0 rss: 73Mb L: 45/45 MS: 1 ChangeBit- 00:07:51.343 [2024-12-17 01:23:37.268511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.343 [2024-12-17 01:23:37.268536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.343 [2024-12-17 01:23:37.268591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86210004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.343 [2024-12-17 01:23:37.268604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.343 [2024-12-17 01:23:37.268658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.343 [2024-12-17 01:23:37.268671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.343 [2024-12-17 01:23:37.268725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.343 [2024-12-17 01:23:37.268738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.343 #23 NEW cov: 12446 ft: 14989 corp: 17/658b lim: 45 exec/s: 23 rss: 73Mb L: 36/45 MS: 1 EraseBytes- 00:07:51.343 [2024-12-17 01:23:37.328559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.343 [2024-12-17 01:23:37.328585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.343 [2024-12-17 01:23:37.328641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.343 [2024-12-17 01:23:37.328655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.343 [2024-12-17 01:23:37.328710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.343 [2024-12-17 01:23:37.328723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.602 #24 NEW cov: 12446 ft: 15033 corp: 18/692b lim: 45 exec/s: 24 rss: 73Mb L: 34/45 MS: 1 CrossOver- 00:07:51.602 [2024-12-17 01:23:37.368981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:2d000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:51.602 [2024-12-17 01:23:37.369007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.602 [2024-12-17 01:23:37.369064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.602 [2024-12-17 01:23:37.369078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.602 [2024-12-17 01:23:37.369132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.602 [2024-12-17 01:23:37.369146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.602 [2024-12-17 01:23:37.369203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.602 [2024-12-17 01:23:37.369217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.602 [2024-12-17 01:23:37.369273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86864a86 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.602 [2024-12-17 01:23:37.369288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.602 #25 NEW cov: 12446 ft: 15034 corp: 19/737b lim: 45 exec/s: 25 rss: 73Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:51.602 [2024-12-17 01:23:37.409106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868630 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.602 [2024-12-17 01:23:37.409132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.409190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.409204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.409261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.409291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.409350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.409364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.409419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.409433] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.603 #26 NEW cov: 12446 ft: 15111 corp: 20/782b lim: 45 exec/s: 26 rss: 73Mb L: 45/45 MS: 1 ChangeByte- 00:07:51.603 [2024-12-17 01:23:37.449234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.449260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.449318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.449331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.449387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.449400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.449455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.449468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.449521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:864a0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.449535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.603 #27 NEW cov: 12446 ft: 15137 corp: 21/827b lim: 45 exec/s: 27 rss: 73Mb L: 45/45 MS: 1 ShuffleBytes- 00:07:51.603 [2024-12-17 01:23:37.489355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.489383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.489442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:7a868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.489456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.489513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86210004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.489527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.489586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.489599] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.489656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.489672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.603 #28 NEW cov: 12446 ft: 15167 corp: 22/872b lim: 45 exec/s: 28 rss: 73Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:51.603 [2024-12-17 01:23:37.529347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.529374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.529434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.529449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.529509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.529524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.529546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.529559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.603 #29 NEW cov: 12446 ft: 15282 corp: 23/908b lim: 45 exec/s: 29 rss: 73Mb L: 36/45 MS: 1 ShuffleBytes- 00:07:51.603 [2024-12-17 01:23:37.589316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.589342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.589398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:21860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.589412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.603 [2024-12-17 01:23:37.589467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.603 [2024-12-17 01:23:37.589481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.862 #30 NEW cov: 12446 ft: 15316 corp: 24/943b lim: 45 exec/s: 30 rss: 73Mb L: 35/45 MS: 1 EraseBytes- 00:07:51.862 [2024-12-17 01:23:37.629543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 
cdw10:86968686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.862 [2024-12-17 01:23:37.629569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.862 [2024-12-17 01:23:37.629628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86210004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.862 [2024-12-17 01:23:37.629641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.862 [2024-12-17 01:23:37.629696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.862 [2024-12-17 01:23:37.629710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.862 [2024-12-17 01:23:37.629767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.862 [2024-12-17 01:23:37.629784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.862 #31 NEW cov: 12446 ft: 15351 corp: 25/979b lim: 45 exec/s: 31 rss: 73Mb L: 36/45 MS: 1 ChangeBit- 00:07:51.862 [2024-12-17 01:23:37.669852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86862986 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.862 [2024-12-17 01:23:37.669878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.862 [2024-12-17 01:23:37.669934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.862 [2024-12-17 01:23:37.669948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.862 [2024-12-17 01:23:37.670003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00002d00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.862 [2024-12-17 01:23:37.670016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.862 [2024-12-17 01:23:37.670072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.862 [2024-12-17 01:23:37.670085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.862 [2024-12-17 01:23:37.670140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.862 [2024-12-17 01:23:37.670153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.862 #32 NEW cov: 12446 ft: 15360 corp: 26/1024b lim: 45 exec/s: 32 rss: 73Mb L: 45/45 MS: 1 ChangeByte- 00:07:51.862 [2024-12-17 01:23:37.730034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 
cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.730060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.730115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86068686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.730128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.730182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.730196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.730253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.730266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.730323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86864a86 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.730337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.863 #33 NEW cov: 12446 ft: 15370 corp: 27/1069b lim: 45 exec/s: 33 rss: 73Mb L: 45/45 MS: 1 ChangeBit- 00:07:51.863 [2024-12-17 01:23:37.790206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:867a8686 cdw11:7c860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.790235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.790291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.790304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.790360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.790374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.790428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.790441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.790497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.790511] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.863 #34 NEW cov: 12446 ft: 15384 corp: 28/1114b lim: 45 exec/s: 34 rss: 73Mb L: 45/45 MS: 1 ChangeByte- 00:07:51.863 [2024-12-17 01:23:37.850376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.850401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.850459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.850473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.850529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.850543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.850598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:b5868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.850611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.863 [2024-12-17 01:23:37.850668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.863 [2024-12-17 01:23:37.850682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.122 #35 NEW cov: 12446 ft: 15388 corp: 29/1159b lim: 45 exec/s: 35 rss: 73Mb L: 45/45 MS: 1 ChangeByte- 00:07:52.122 [2024-12-17 01:23:37.890448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:37.890475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:37.890533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:37.890549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:37.890606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:867c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:37.890620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:37.890673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 
01:23:37.890686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:37.890739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86864a86 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:37.890753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.122 #36 NEW cov: 12446 ft: 15399 corp: 30/1204b lim: 45 exec/s: 36 rss: 73Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:52.122 [2024-12-17 01:23:37.929934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:37.929959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.122 #37 NEW cov: 12446 ft: 15400 corp: 31/1221b lim: 45 exec/s: 37 rss: 74Mb L: 17/45 MS: 1 EraseBytes- 00:07:52.122 [2024-12-17 01:23:37.990592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868630 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:37.990617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:37.990673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:37.990687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:37.990742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:37.990756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:37.990810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:37.990823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.122 #38 NEW cov: 12446 ft: 15412 corp: 32/1260b lim: 45 exec/s: 38 rss: 74Mb L: 39/45 MS: 1 CrossOver- 00:07:52.122 [2024-12-17 01:23:38.050881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868630 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:38.050907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:38.050960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:38.050974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:38.051027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ 
(01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:38.051045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:38.051101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:38.051115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:38.051167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:38.051180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.122 #39 NEW cov: 12446 ft: 15430 corp: 33/1305b lim: 45 exec/s: 39 rss: 74Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:52.122 [2024-12-17 01:23:38.091018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:38.091043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:38.091099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:38.091113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:38.091167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00002d00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:38.091180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:38.091231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:38.091245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.122 [2024-12-17 01:23:38.091300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.122 [2024-12-17 01:23:38.091312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.122 #40 NEW cov: 12446 ft: 15466 corp: 34/1350b lim: 45 exec/s: 40 rss: 74Mb L: 45/45 MS: 1 ShuffleBytes- 00:07:52.381 [2024-12-17 01:23:38.131166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.381 [2024-12-17 01:23:38.131191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.381 [2024-12-17 01:23:38.131245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE 
IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.381 [2024-12-17 01:23:38.131258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.381 [2024-12-17 01:23:38.131311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00002d00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.381 [2024-12-17 01:23:38.131324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.381 [2024-12-17 01:23:38.131376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.381 [2024-12-17 01:23:38.131389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.381 [2024-12-17 01:23:38.131446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:7a728686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.381 [2024-12-17 01:23:38.131458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.381 #41 NEW cov: 12446 ft: 15471 corp: 35/1395b lim: 45 exec/s: 41 rss: 74Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:52.381 [2024-12-17 01:23:38.171291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.381 [2024-12-17 01:23:38.171316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.381 [2024-12-17 01:23:38.171371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.381 [2024-12-17 01:23:38.171385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.381 [2024-12-17 01:23:38.171440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.381 [2024-12-17 01:23:38.171454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.381 [2024-12-17 01:23:38.171506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86300004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.381 [2024-12-17 01:23:38.171519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.381 [2024-12-17 01:23:38.171572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.381 [2024-12-17 01:23:38.171585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.381 #42 NEW cov: 12446 ft: 15476 corp: 36/1440b lim: 45 exec/s: 42 rss: 74Mb L: 45/45 MS: 1 ChangeByte- 00:07:52.381 [2024-12-17 01:23:38.211393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.381 [2024-12-17 01:23:38.211419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.382 [2024-12-17 01:23:38.211472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.382 [2024-12-17 01:23:38.211485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.382 [2024-12-17 01:23:38.211538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00002d00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.382 [2024-12-17 01:23:38.211551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.382 [2024-12-17 01:23:38.211606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.382 [2024-12-17 01:23:38.211619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.382 [2024-12-17 01:23:38.211671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.382 [2024-12-17 01:23:38.211685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.382 #43 NEW cov: 12446 ft: 15524 corp: 37/1485b lim: 45 exec/s: 43 rss: 74Mb L: 45/45 MS: 1 ChangeBit- 00:07:52.382 [2024-12-17 01:23:38.271589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.382 [2024-12-17 01:23:38.271615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.382 [2024-12-17 01:23:38.271670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.382 [2024-12-17 01:23:38.271684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.382 [2024-12-17 01:23:38.271738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00002d00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.382 [2024-12-17 01:23:38.271751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.382 [2024-12-17 01:23:38.271800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.382 [2024-12-17 01:23:38.271813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.382 [2024-12-17 01:23:38.271866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:86868686 cdw11:86860004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:52.382 [2024-12-17 01:23:38.271880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.382 #44 NEW cov: 12446 ft: 15528 corp: 38/1530b lim: 45 exec/s: 22 rss: 74Mb L: 45/45 MS: 1 ChangeByte- 00:07:52.382 #44 DONE cov: 12446 ft: 15528 corp: 38/1530b lim: 45 exec/s: 22 rss: 74Mb 00:07:52.382 Done 44 runs in 2 second(s) 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:52.641 01:23:38 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:07:52.641 [2024-12-17 01:23:38.454244] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:52.641 [2024-12-17 01:23:38.454324] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid825830 ] 00:07:52.900 [2024-12-17 01:23:38.701241] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.900 [2024-12-17 01:23:38.729385] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.900 [2024-12-17 01:23:38.781739] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.900 [2024-12-17 01:23:38.798086] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:52.900 INFO: Running with entropic power schedule (0xFF, 100). 00:07:52.900 INFO: Seed: 2493197911 00:07:52.900 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:52.900 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:52.900 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:52.900 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.900 #2 INITED exec/s: 0 rss: 64Mb 00:07:52.900 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:52.900 This may also happen if the target rejected all inputs we tried so far 00:07:52.900 [2024-12-17 01:23:38.843551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:52.900 [2024-12-17 01:23:38.843579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.900 [2024-12-17 01:23:38.843630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:52.900 [2024-12-17 01:23:38.843643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.900 [2024-12-17 01:23:38.843692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:52.900 [2024-12-17 01:23:38.843706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.158 NEW_FUNC[1/713]: 0x45d2c8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:53.158 NEW_FUNC[2/713]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:53.158 #3 NEW cov: 12132 ft: 12134 corp: 2/7b lim: 10 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:53.417 [2024-12-17 01:23:39.174186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.174219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.417 #5 NEW cov: 12249 ft: 13049 corp: 3/10b lim: 10 exec/s: 0 rss: 72Mb L: 3/6 MS: 2 ChangeByte-CMP- DE: "\001\000"- 00:07:53.417 [2024-12-17 01:23:39.214457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 
00:07:53.417 [2024-12-17 01:23:39.214483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.417 [2024-12-17 01:23:39.214530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.214544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.417 [2024-12-17 01:23:39.214592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008960 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.214609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.417 #6 NEW cov: 12255 ft: 13166 corp: 4/17b lim: 10 exec/s: 0 rss: 72Mb L: 7/7 MS: 1 InsertByte- 00:07:53.417 [2024-12-17 01:23:39.274618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.274643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.417 [2024-12-17 01:23:39.274694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.274707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.417 [2024-12-17 01:23:39.274754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.274768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.417 #7 NEW cov: 12340 ft: 13431 corp: 5/23b lim: 10 exec/s: 0 rss: 72Mb L: 6/7 MS: 1 ShuffleBytes- 00:07:53.417 [2024-12-17 01:23:39.314839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000600a cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.314865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.417 [2024-12-17 01:23:39.314914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.314927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.417 [2024-12-17 01:23:39.314976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006089 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.314990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.417 [2024-12-17 01:23:39.315037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.315050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.417 #8 NEW cov: 12340 ft: 13687 corp: 6/31b lim: 10 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 CopyPart- 00:07:53.417 [2024-12-17 01:23:39.374771] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.374800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.417 [2024-12-17 01:23:39.374865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008960 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.374879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.417 #10 NEW cov: 12340 ft: 13925 corp: 7/36b lim: 10 exec/s: 0 rss: 72Mb L: 5/8 MS: 2 ShuffleBytes-CrossOver- 00:07:53.417 [2024-12-17 01:23:39.415010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.415036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.417 [2024-12-17 01:23:39.415086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.415099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.417 [2024-12-17 01:23:39.415147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.417 [2024-12-17 01:23:39.415165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.675 #11 NEW cov: 12340 ft: 13982 corp: 8/42b lim: 10 exec/s: 0 rss: 72Mb L: 6/8 MS: 1 ShuffleBytes- 00:07:53.675 [2024-12-17 01:23:39.475300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:53.675 [2024-12-17 01:23:39.475326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.675 [2024-12-17 01:23:39.475376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.675 [2024-12-17 01:23:39.475389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.675 [2024-12-17 01:23:39.475439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006089 cdw11:00000000 00:07:53.675 [2024-12-17 01:23:39.475453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.675 [2024-12-17 01:23:39.475503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.676 [2024-12-17 01:23:39.475516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.676 #12 NEW cov: 12340 ft: 13999 corp: 9/50b lim: 10 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 CopyPart- 00:07:53.676 [2024-12-17 01:23:39.515302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:53.676 [2024-12-17 01:23:39.515327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.676 [2024-12-17 01:23:39.515380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000089ff cdw11:00000000 00:07:53.676 [2024-12-17 01:23:39.515393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.676 [2024-12-17 01:23:39.515443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.676 [2024-12-17 01:23:39.515457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.676 #13 NEW cov: 12340 ft: 14089 corp: 10/56b lim: 10 exec/s: 0 rss: 72Mb L: 6/8 MS: 1 InsertByte- 00:07:53.676 [2024-12-17 01:23:39.575553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:53.676 [2024-12-17 01:23:39.575578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.676 [2024-12-17 01:23:39.575629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:00000000 00:07:53.676 [2024-12-17 01:23:39.575641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.676 [2024-12-17 01:23:39.575690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.676 [2024-12-17 01:23:39.575704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.676 [2024-12-17 01:23:39.575754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.676 [2024-12-17 01:23:39.575766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.676 #14 NEW cov: 12340 ft: 14161 corp: 11/64b lim: 10 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:53.676 [2024-12-17 01:23:39.635512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:53.676 [2024-12-17 01:23:39.635540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.676 [2024-12-17 01:23:39.635592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000160 cdw11:00000000 00:07:53.676 [2024-12-17 01:23:39.635605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.676 #15 NEW cov: 12340 ft: 14218 corp: 12/68b lim: 10 exec/s: 0 rss: 72Mb L: 4/8 MS: 1 EraseBytes- 00:07:53.934 [2024-12-17 01:23:39.695915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:53.934 [2024-12-17 01:23:39.695942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.934 [2024-12-17 01:23:39.695994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 
cdw11:00000000 00:07:53.934 [2024-12-17 01:23:39.696008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.934 [2024-12-17 01:23:39.696056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.934 [2024-12-17 01:23:39.696070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.934 [2024-12-17 01:23:39.696118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.934 [2024-12-17 01:23:39.696131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.934 #16 NEW cov: 12340 ft: 14276 corp: 13/77b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CopyPart- 00:07:53.934 [2024-12-17 01:23:39.735684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0e cdw11:00000000 00:07:53.934 [2024-12-17 01:23:39.735710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.934 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:53.934 #18 NEW cov: 12363 ft: 14322 corp: 14/79b lim: 10 exec/s: 0 rss: 73Mb L: 2/9 MS: 2 ChangeBit-CopyPart- 00:07:53.934 [2024-12-17 01:23:39.776257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a71 cdw11:00000000 00:07:53.935 [2024-12-17 01:23:39.776283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.935 [2024-12-17 01:23:39.776334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007171 cdw11:00000000 00:07:53.935 [2024-12-17 01:23:39.776348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.935 [2024-12-17 01:23:39.776397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007160 cdw11:00000000 00:07:53.935 [2024-12-17 01:23:39.776411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.935 [2024-12-17 01:23:39.776460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.935 [2024-12-17 01:23:39.776472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.935 [2024-12-17 01:23:39.776523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.935 [2024-12-17 01:23:39.776537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.935 #19 NEW cov: 12363 ft: 14366 corp: 15/89b lim: 10 exec/s: 0 rss: 73Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:53.935 [2024-12-17 01:23:39.815936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:53.935 [2024-12-17 01:23:39.815962] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.935 #20 NEW cov: 12363 ft: 14385 corp: 16/92b lim: 10 exec/s: 20 rss: 73Mb L: 3/10 MS: 1 CrossOver- 00:07:53.935 [2024-12-17 01:23:39.876067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:07:53.935 [2024-12-17 01:23:39.876093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.935 #21 NEW cov: 12363 ft: 14405 corp: 17/95b lim: 10 exec/s: 21 rss: 73Mb L: 3/10 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:53.935 [2024-12-17 01:23:39.916529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:53.935 [2024-12-17 01:23:39.916554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.935 [2024-12-17 01:23:39.916604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:53.935 [2024-12-17 01:23:39.916617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.935 [2024-12-17 01:23:39.916668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006001 cdw11:00000000 00:07:53.935 [2024-12-17 01:23:39.916682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.935 [2024-12-17 01:23:39.916730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000060 cdw11:00000000 00:07:53.935 [2024-12-17 01:23:39.916743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.194 #22 NEW cov: 12363 ft: 14498 corp: 18/104b lim: 10 exec/s: 22 rss: 73Mb L: 9/10 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:54.194 [2024-12-17 01:23:39.956414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:54.194 [2024-12-17 01:23:39.956439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.194 [2024-12-17 01:23:39.956490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.194 [2024-12-17 01:23:39.956504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.194 #23 NEW cov: 12363 ft: 14522 corp: 19/109b lim: 10 exec/s: 23 rss: 73Mb L: 5/10 MS: 1 EraseBytes- 00:07:54.194 [2024-12-17 01:23:39.996414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007501 cdw11:00000000 00:07:54.194 [2024-12-17 01:23:39.996439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.194 #24 NEW cov: 12363 ft: 14535 corp: 20/112b lim: 10 exec/s: 24 rss: 73Mb L: 3/10 MS: 1 ShuffleBytes- 00:07:54.194 [2024-12-17 01:23:40.057112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a71 cdw11:00000000 00:07:54.194 
[2024-12-17 01:23:40.057145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.194 [2024-12-17 01:23:40.057199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007171 cdw11:00000000 00:07:54.194 [2024-12-17 01:23:40.057213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.194 [2024-12-17 01:23:40.057266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007171 cdw11:00000000 00:07:54.194 [2024-12-17 01:23:40.057284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.194 [2024-12-17 01:23:40.057334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.194 [2024-12-17 01:23:40.057347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.194 [2024-12-17 01:23:40.057399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.194 [2024-12-17 01:23:40.057413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.194 #25 NEW cov: 12363 ft: 14566 corp: 21/122b lim: 10 exec/s: 25 rss: 73Mb L: 10/10 MS: 1 CopyPart- 00:07:54.194 [2024-12-17 01:23:40.116942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:54.194 [2024-12-17 01:23:40.116974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.194 [2024-12-17 01:23:40.117028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.194 [2024-12-17 01:23:40.117041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.194 #26 NEW cov: 12363 ft: 14602 corp: 22/126b lim: 10 exec/s: 26 rss: 73Mb L: 4/10 MS: 1 EraseBytes- 00:07:54.194 [2024-12-17 01:23:40.157022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007501 cdw11:00000000 00:07:54.194 [2024-12-17 01:23:40.157048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.194 [2024-12-17 01:23:40.157099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 00:07:54.194 [2024-12-17 01:23:40.157113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.452 #27 NEW cov: 12363 ft: 14610 corp: 23/131b lim: 10 exec/s: 27 rss: 73Mb L: 5/10 MS: 1 CMP- DE: "\004\000"- 00:07:54.452 [2024-12-17 01:23:40.217058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:07:54.452 [2024-12-17 01:23:40.217084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.452 #28 NEW cov: 12363 ft: 14629 corp: 24/134b lim: 10 
exec/s: 28 rss: 73Mb L: 3/10 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:54.452 [2024-12-17 01:23:40.277461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000600a cdw11:00000000 00:07:54.452 [2024-12-17 01:23:40.277486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.452 [2024-12-17 01:23:40.277539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.277553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.453 [2024-12-17 01:23:40.277605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006089 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.277619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.453 #29 NEW cov: 12363 ft: 14636 corp: 25/141b lim: 10 exec/s: 29 rss: 73Mb L: 7/10 MS: 1 EraseBytes- 00:07:54.453 [2024-12-17 01:23:40.337783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.337817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.453 [2024-12-17 01:23:40.337875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.337889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.453 [2024-12-17 01:23:40.337941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.337955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.453 [2024-12-17 01:23:40.338007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.338020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.453 #30 NEW cov: 12363 ft: 14637 corp: 26/149b lim: 10 exec/s: 30 rss: 73Mb L: 8/10 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:54.453 [2024-12-17 01:23:40.377672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.377698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.453 [2024-12-17 01:23:40.377751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000125 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.377764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.453 #31 NEW cov: 12363 ft: 14640 corp: 27/154b lim: 10 exec/s: 31 rss: 73Mb L: 5/10 MS: 1 InsertByte- 00:07:54.453 [2024-12-17 01:23:40.438147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 
cdw10:00000a60 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.438173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.453 [2024-12-17 01:23:40.438225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006071 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.438239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.453 [2024-12-17 01:23:40.438291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007171 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.438305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.453 [2024-12-17 01:23:40.438358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007160 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.438371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.453 [2024-12-17 01:23:40.438420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007160 cdw11:00000000 00:07:54.453 [2024-12-17 01:23:40.438434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.712 #32 NEW cov: 12363 ft: 14664 corp: 28/164b lim: 10 exec/s: 32 rss: 73Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:54.712 [2024-12-17 01:23:40.498077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.498103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.712 [2024-12-17 01:23:40.498157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.498171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.712 [2024-12-17 01:23:40.498224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008860 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.498240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.712 #33 NEW cov: 12363 ft: 14677 corp: 29/171b lim: 10 exec/s: 33 rss: 73Mb L: 7/10 MS: 1 ChangeBinInt- 00:07:54.712 [2024-12-17 01:23:40.538089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.538114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.712 [2024-12-17 01:23:40.538166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b660 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.538180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.712 #34 NEW cov: 12363 ft: 14692 corp: 30/175b lim: 10 exec/s: 34 rss: 73Mb L: 4/10 MS: 1 ChangeByte- 00:07:54.712 [2024-12-17 
01:23:40.578328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.578353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.712 [2024-12-17 01:23:40.578408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.578432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.712 [2024-12-17 01:23:40.578501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e060 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.578515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.712 #35 NEW cov: 12363 ft: 14766 corp: 31/181b lim: 10 exec/s: 35 rss: 73Mb L: 6/10 MS: 1 ChangeBit- 00:07:54.712 [2024-12-17 01:23:40.618572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000060 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.618597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.712 [2024-12-17 01:23:40.618651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.618664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.712 [2024-12-17 01:23:40.618715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006089 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.618729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.712 [2024-12-17 01:23:40.618780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.618798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.712 #36 NEW cov: 12363 ft: 14780 corp: 32/189b lim: 10 exec/s: 36 rss: 74Mb L: 8/10 MS: 1 ChangeByte- 00:07:54.712 [2024-12-17 01:23:40.678541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006008 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.678567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.712 [2024-12-17 01:23:40.678618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.678633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.712 [2024-12-17 01:23:40.678683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006089 cdw11:00000000 00:07:54.712 [2024-12-17 01:23:40.678700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.971 #37 NEW cov: 
12363 ft: 14797 corp: 33/196b lim: 10 exec/s: 37 rss: 74Mb L: 7/10 MS: 1 ChangeBit- 00:07:54.971 [2024-12-17 01:23:40.738855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000060 cdw11:00000000 00:07:54.971 [2024-12-17 01:23:40.738883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.971 [2024-12-17 01:23:40.738933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006060 cdw11:00000000 00:07:54.971 [2024-12-17 01:23:40.738948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.971 [2024-12-17 01:23:40.738997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00003a60 cdw11:00000000 00:07:54.971 [2024-12-17 01:23:40.739011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.971 [2024-12-17 01:23:40.739059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00008960 cdw11:00000000 00:07:54.971 [2024-12-17 01:23:40.739072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.971 #38 NEW cov: 12363 ft: 14811 corp: 34/205b lim: 10 exec/s: 38 rss: 74Mb L: 9/10 MS: 1 InsertByte- 00:07:54.971 [2024-12-17 01:23:40.798763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007501 cdw11:00000000 00:07:54.971 [2024-12-17 01:23:40.798788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.971 [2024-12-17 01:23:40.798843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:54.971 [2024-12-17 01:23:40.798857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.971 #39 NEW cov: 12363 ft: 14816 corp: 35/209b lim: 10 exec/s: 19 rss: 74Mb L: 4/10 MS: 1 EraseBytes- 00:07:54.971 #39 DONE cov: 12363 ft: 14816 corp: 35/209b lim: 10 exec/s: 19 rss: 74Mb 00:07:54.971 ###### Recommended dictionary. ###### 00:07:54.971 "\001\000" # Uses: 5 00:07:54.971 "\004\000" # Uses: 0 00:07:54.971 ###### End of recommended dictionary. 
###### 00:07:54.971 Done 39 runs in 2 second(s) 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:54.971 01:23:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:07:55.229 [2024-12-17 01:23:40.998085] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:55.229 [2024-12-17 01:23:40.998156] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid826366 ] 00:07:55.488 [2024-12-17 01:23:41.249205] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.488 [2024-12-17 01:23:41.279863] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.488 [2024-12-17 01:23:41.332116] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.488 [2024-12-17 01:23:41.348428] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:55.488 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:55.488 INFO: Seed: 747226990 00:07:55.488 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:55.488 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:55.488 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:55.488 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.488 #2 INITED exec/s: 0 rss: 64Mb 00:07:55.488 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:55.488 This may also happen if the target rejected all inputs we tried so far 00:07:55.488 [2024-12-17 01:23:41.397368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a5b cdw11:00000000 00:07:55.488 [2024-12-17 01:23:41.397397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.746 NEW_FUNC[1/713]: 0x45dcc8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:55.746 NEW_FUNC[2/713]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:55.746 #4 NEW cov: 12136 ft: 12133 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 2 ChangeBit-InsertByte- 00:07:55.746 [2024-12-17 01:23:41.708196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a8a cdw11:00000000 00:07:55.746 [2024-12-17 01:23:41.708233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.004 #6 NEW cov: 12249 ft: 12674 corp: 3/5b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 2 EraseBytes-InsertByte- 00:07:56.005 [2024-12-17 01:23:41.768233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a8a cdw11:00000000 00:07:56.005 [2024-12-17 01:23:41.768261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.005 #7 NEW cov: 12255 ft: 13010 corp: 4/7b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 CopyPart- 00:07:56.005 [2024-12-17 01:23:41.828373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff8a cdw11:00000000 00:07:56.005 [2024-12-17 01:23:41.828398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.005 #8 NEW cov: 12340 ft: 13288 corp: 5/9b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 ChangeByte- 00:07:56.005 [2024-12-17 01:23:41.888523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00009aff cdw11:00000000 00:07:56.005 [2024-12-17 01:23:41.888548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.005 #11 NEW cov: 12340 ft: 13468 corp: 6/12b lim: 10 exec/s: 0 rss: 72Mb L: 3/3 MS: 3 EraseBytes-ChangeByte-CrossOver- 00:07:56.005 [2024-12-17 01:23:41.928899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00009aff cdw11:00000000 00:07:56.005 [2024-12-17 01:23:41.928924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.005 
[2024-12-17 01:23:41.928977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:56.005 [2024-12-17 01:23:41.928991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.005 [2024-12-17 01:23:41.929042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff8a cdw11:00000000 00:07:56.005 [2024-12-17 01:23:41.929056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.005 #12 NEW cov: 12340 ft: 13815 corp: 7/18b lim: 10 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:56.005 [2024-12-17 01:23:41.988818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff8a cdw11:00000000 00:07:56.005 [2024-12-17 01:23:41.988844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.263 #13 NEW cov: 12340 ft: 13878 corp: 8/21b lim: 10 exec/s: 0 rss: 72Mb L: 3/6 MS: 1 CopyPart- 00:07:56.263 [2024-12-17 01:23:42.049019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a08a cdw11:00000000 00:07:56.263 [2024-12-17 01:23:42.049045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.263 #14 NEW cov: 12340 ft: 13927 corp: 9/23b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 1 ChangeByte- 00:07:56.263 [2024-12-17 01:23:42.089136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff8a cdw11:00000000 00:07:56.263 [2024-12-17 01:23:42.089161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.263 #15 NEW cov: 12340 ft: 13989 corp: 10/26b lim: 10 exec/s: 0 rss: 72Mb L: 3/6 MS: 1 ShuffleBytes- 00:07:56.263 [2024-12-17 01:23:42.149269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000bf8a cdw11:00000000 00:07:56.263 [2024-12-17 01:23:42.149295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.263 #16 NEW cov: 12340 ft: 14045 corp: 11/29b lim: 10 exec/s: 0 rss: 72Mb L: 3/6 MS: 1 ChangeBit- 00:07:56.263 [2024-12-17 01:23:42.189404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000bf8a cdw11:00000000 00:07:56.263 [2024-12-17 01:23:42.189430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.263 #17 NEW cov: 12340 ft: 14090 corp: 12/32b lim: 10 exec/s: 0 rss: 73Mb L: 3/6 MS: 1 ChangeBit- 00:07:56.263 [2024-12-17 01:23:42.249702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff8a cdw11:00000000 00:07:56.263 [2024-12-17 01:23:42.249727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.263 [2024-12-17 01:23:42.249779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:07:56.263 [2024-12-17 01:23:42.249797] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.521 #18 NEW cov: 12340 ft: 14259 corp: 13/37b lim: 10 exec/s: 0 rss: 73Mb L: 5/6 MS: 1 InsertRepeatedBytes- 00:07:56.521 [2024-12-17 01:23:42.289915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:56.521 [2024-12-17 01:23:42.289940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.521 [2024-12-17 01:23:42.289992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008ad8 cdw11:00000000 00:07:56.521 [2024-12-17 01:23:42.290006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.521 [2024-12-17 01:23:42.290056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:07:56.521 [2024-12-17 01:23:42.290070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.521 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:56.521 #19 NEW cov: 12363 ft: 14331 corp: 14/43b lim: 10 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 CrossOver- 00:07:56.521 [2024-12-17 01:23:42.329787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a5b cdw11:00000000 00:07:56.521 [2024-12-17 01:23:42.329817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.521 #20 NEW cov: 12363 ft: 14349 corp: 15/45b lim: 10 exec/s: 0 rss: 73Mb L: 2/6 MS: 1 ShuffleBytes- 00:07:56.521 [2024-12-17 01:23:42.369967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:56.521 [2024-12-17 01:23:42.369992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.521 #21 NEW cov: 12363 ft: 14364 corp: 16/47b lim: 10 exec/s: 21 rss: 73Mb L: 2/6 MS: 1 ChangeBinInt- 00:07:56.521 [2024-12-17 01:23:42.410016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a8a cdw11:00000000 00:07:56.521 [2024-12-17 01:23:42.410041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.521 #22 NEW cov: 12363 ft: 14407 corp: 17/49b lim: 10 exec/s: 22 rss: 73Mb L: 2/6 MS: 1 CopyPart- 00:07:56.521 [2024-12-17 01:23:42.470290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff8a cdw11:00000000 00:07:56.521 [2024-12-17 01:23:42.470315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.521 [2024-12-17 01:23:42.470366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d8dc cdw11:00000000 00:07:56.521 [2024-12-17 01:23:42.470380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.521 #23 NEW cov: 12363 ft: 14427 corp: 18/54b lim: 10 exec/s: 23 rss: 73Mb L: 5/6 MS: 1 
ChangeBinInt- 00:07:56.779 [2024-12-17 01:23:42.530603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff67 cdw11:00000000 00:07:56.779 [2024-12-17 01:23:42.530628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.779 [2024-12-17 01:23:42.530680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008ad8 cdw11:00000000 00:07:56.779 [2024-12-17 01:23:42.530693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.779 [2024-12-17 01:23:42.530744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000dcd8 cdw11:00000000 00:07:56.779 [2024-12-17 01:23:42.530757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.779 #24 NEW cov: 12363 ft: 14443 corp: 19/60b lim: 10 exec/s: 24 rss: 73Mb L: 6/6 MS: 1 InsertByte- 00:07:56.779 [2024-12-17 01:23:42.590552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000e05b cdw11:00000000 00:07:56.779 [2024-12-17 01:23:42.590577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.779 #25 NEW cov: 12363 ft: 14506 corp: 20/62b lim: 10 exec/s: 25 rss: 73Mb L: 2/6 MS: 1 ChangeByte- 00:07:56.779 [2024-12-17 01:23:42.650693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000bf90 cdw11:00000000 00:07:56.779 [2024-12-17 01:23:42.650717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.779 #26 NEW cov: 12363 ft: 14523 corp: 21/65b lim: 10 exec/s: 26 rss: 73Mb L: 3/6 MS: 1 ChangeByte- 00:07:56.779 [2024-12-17 01:23:42.710873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008af5 cdw11:00000000 00:07:56.779 [2024-12-17 01:23:42.710897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.779 #27 NEW cov: 12363 ft: 14537 corp: 22/67b lim: 10 exec/s: 27 rss: 73Mb L: 2/6 MS: 1 ChangeByte- 00:07:56.779 [2024-12-17 01:23:42.750985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00009afe cdw11:00000000 00:07:56.779 [2024-12-17 01:23:42.751010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.779 #28 NEW cov: 12363 ft: 14550 corp: 23/70b lim: 10 exec/s: 28 rss: 73Mb L: 3/6 MS: 1 ChangeBit- 00:07:57.037 [2024-12-17 01:23:42.791110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cda0 cdw11:00000000 00:07:57.037 [2024-12-17 01:23:42.791136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.037 #29 NEW cov: 12363 ft: 14578 corp: 24/73b lim: 10 exec/s: 29 rss: 73Mb L: 3/6 MS: 1 InsertByte- 00:07:57.037 [2024-12-17 01:23:42.831386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000328a cdw11:00000000 00:07:57.037 [2024-12-17 01:23:42.831410] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.037 [2024-12-17 01:23:42.831464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d8dc cdw11:00000000 00:07:57.037 [2024-12-17 01:23:42.831477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.037 #30 NEW cov: 12363 ft: 14581 corp: 25/78b lim: 10 exec/s: 30 rss: 73Mb L: 5/6 MS: 1 ChangeByte- 00:07:57.037 [2024-12-17 01:23:42.871334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008b8a cdw11:00000000 00:07:57.037 [2024-12-17 01:23:42.871359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.037 #31 NEW cov: 12363 ft: 14582 corp: 26/80b lim: 10 exec/s: 31 rss: 73Mb L: 2/6 MS: 1 ChangeBit- 00:07:57.037 [2024-12-17 01:23:42.931486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cda0 cdw11:00000000 00:07:57.037 [2024-12-17 01:23:42.931511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.037 #32 NEW cov: 12363 ft: 14615 corp: 27/82b lim: 10 exec/s: 32 rss: 73Mb L: 2/6 MS: 1 EraseBytes- 00:07:57.037 [2024-12-17 01:23:42.991767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:57.038 [2024-12-17 01:23:42.991795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.038 [2024-12-17 01:23:42.991850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008aff cdw11:00000000 00:07:57.038 [2024-12-17 01:23:42.991863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.038 #33 NEW cov: 12363 ft: 14619 corp: 28/86b lim: 10 exec/s: 33 rss: 74Mb L: 4/6 MS: 1 CopyPart- 00:07:57.296 [2024-12-17 01:23:43.051797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fff5 cdw11:00000000 00:07:57.296 [2024-12-17 01:23:43.051822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.296 #34 NEW cov: 12363 ft: 14629 corp: 29/88b lim: 10 exec/s: 34 rss: 74Mb L: 2/6 MS: 1 ChangeByte- 00:07:57.296 [2024-12-17 01:23:43.112242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff43 cdw11:00000000 00:07:57.296 [2024-12-17 01:23:43.112267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.296 [2024-12-17 01:23:43.112319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004343 cdw11:00000000 00:07:57.296 [2024-12-17 01:23:43.112333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.296 [2024-12-17 01:23:43.112386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000043f5 cdw11:00000000 00:07:57.296 [2024-12-17 01:23:43.112400] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.296 #35 NEW cov: 12363 ft: 14667 corp: 30/94b lim: 10 exec/s: 35 rss: 74Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:57.296 [2024-12-17 01:23:43.172199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000408a cdw11:00000000 00:07:57.296 [2024-12-17 01:23:43.172224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.296 #39 NEW cov: 12363 ft: 14675 corp: 31/97b lim: 10 exec/s: 39 rss: 74Mb L: 3/6 MS: 4 EraseBytes-ChangeByte-ShuffleBytes-CrossOver- 00:07:57.296 [2024-12-17 01:23:43.212329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005d8a cdw11:00000000 00:07:57.296 [2024-12-17 01:23:43.212354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.296 #40 NEW cov: 12363 ft: 14725 corp: 32/100b lim: 10 exec/s: 40 rss: 74Mb L: 3/6 MS: 1 ChangeByte- 00:07:57.296 [2024-12-17 01:23:43.252397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000bf90 cdw11:00000000 00:07:57.297 [2024-12-17 01:23:43.252422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.297 #41 NEW cov: 12363 ft: 14743 corp: 33/103b lim: 10 exec/s: 41 rss: 74Mb L: 3/6 MS: 1 ChangeBit- 00:07:57.555 [2024-12-17 01:23:43.312823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff8a cdw11:00000000 00:07:57.555 [2024-12-17 01:23:43.312848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.555 [2024-12-17 01:23:43.312901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:07:57.555 [2024-12-17 01:23:43.312914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.555 [2024-12-17 01:23:43.312967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000dd8 cdw11:00000000 00:07:57.555 [2024-12-17 01:23:43.312981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.555 #42 NEW cov: 12363 ft: 14759 corp: 34/109b lim: 10 exec/s: 42 rss: 74Mb L: 6/6 MS: 1 InsertByte- 00:07:57.555 [2024-12-17 01:23:43.352728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000278a cdw11:00000000 00:07:57.555 [2024-12-17 01:23:43.352753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.555 [2024-12-17 01:23:43.392816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000278a cdw11:00000000 00:07:57.555 [2024-12-17 01:23:43.392840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.555 #44 NEW cov: 12363 ft: 14826 corp: 35/112b lim: 10 exec/s: 22 rss: 74Mb L: 3/6 MS: 2 ChangeByte-InsertByte- 00:07:57.555 #44 DONE cov: 12363 ft: 14826 corp: 35/112b lim: 10 
exec/s: 22 rss: 74Mb 00:07:57.555 Done 44 runs in 2 second(s) 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:57.555 01:23:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:07:57.814 [2024-12-17 01:23:43.564896] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:57.814 [2024-12-17 01:23:43.564966] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid826657 ] 00:07:57.814 [2024-12-17 01:23:43.744144] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.814 [2024-12-17 01:23:43.765610] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.072 [2024-12-17 01:23:43.818032] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.072 [2024-12-17 01:23:43.834255] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:58.072 INFO: Running with entropic power schedule (0xFF, 100). 00:07:58.072 INFO: Seed: 3234255545 00:07:58.072 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:58.072 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:58.072 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:58.072 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.072 [2024-12-17 01:23:43.879722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.072 [2024-12-17 01:23:43.879751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.072 #2 INITED cov: 12162 ft: 12145 corp: 1/1b exec/s: 0 rss: 71Mb 00:07:58.072 [2024-12-17 01:23:43.919716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.072 [2024-12-17 01:23:43.919743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.072 #3 NEW cov: 12277 ft: 12864 corp: 2/2b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 CrossOver- 00:07:58.072 [2024-12-17 01:23:43.980074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.072 [2024-12-17 01:23:43.980100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.072 [2024-12-17 01:23:43.980154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.072 [2024-12-17 01:23:43.980168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.072 #4 NEW cov: 12283 ft: 13720 corp: 3/4b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 CopyPart- 00:07:58.072 [2024-12-17 01:23:44.019999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.072 [2024-12-17 01:23:44.020025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.072 #5 NEW cov: 12368 ft: 14011 corp: 4/5b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 EraseBytes- 00:07:58.331 [2024-12-17 01:23:44.080342] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.331 [2024-12-17 01:23:44.080368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.331 [2024-12-17 01:23:44.080424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.331 [2024-12-17 01:23:44.080438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.331 #6 NEW cov: 12368 ft: 14054 corp: 5/7b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 ShuffleBytes- 00:07:58.331 [2024-12-17 01:23:44.120251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.331 [2024-12-17 01:23:44.120276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.331 #7 NEW cov: 12368 ft: 14176 corp: 6/8b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ChangeByte- 00:07:58.331 [2024-12-17 01:23:44.160530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.331 [2024-12-17 01:23:44.160556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.331 [2024-12-17 01:23:44.160611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.331 [2024-12-17 01:23:44.160628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.331 #8 NEW cov: 12368 ft: 14216 corp: 7/10b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 InsertByte- 00:07:58.331 [2024-12-17 01:23:44.200807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.331 [2024-12-17 01:23:44.200833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.331 [2024-12-17 01:23:44.200889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.331 [2024-12-17 01:23:44.200902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.331 [2024-12-17 01:23:44.200954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.331 [2024-12-17 01:23:44.200968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.331 #9 NEW cov: 12368 ft: 14421 corp: 8/13b lim: 5 exec/s: 0 rss: 71Mb L: 3/3 MS: 1 InsertByte- 00:07:58.331 [2024-12-17 01:23:44.260667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.331 [2024-12-17 01:23:44.260693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.331 #10 NEW cov: 12368 ft: 14444 corp: 9/14b lim: 5 exec/s: 0 rss: 71Mb L: 1/3 MS: 1 EraseBytes- 00:07:58.331 [2024-12-17 01:23:44.320961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.331 [2024-12-17 01:23:44.320986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.331 [2024-12-17 01:23:44.321041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.331 [2024-12-17 01:23:44.321054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.590 #11 NEW cov: 12368 ft: 14498 corp: 10/16b lim: 5 exec/s: 0 rss: 71Mb L: 2/3 MS: 1 InsertByte- 00:07:58.590 [2024-12-17 01:23:44.381431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.381456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.381513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.381527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.381581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.381609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.381665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.381678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.590 #12 NEW cov: 12368 ft: 14782 corp: 11/20b lim: 5 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 CrossOver- 00:07:58.590 [2024-12-17 01:23:44.441412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.441438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.441494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.441509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.441563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.441576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.590 #13 NEW cov: 12368 ft: 14833 corp: 12/23b lim: 5 exec/s: 0 rss: 72Mb L: 3/4 MS: 1 ChangeBit- 00:07:58.590 [2024-12-17 01:23:44.481808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.481850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.481905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.481919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.481972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.481987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.482041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.482054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.482105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.482119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.590 #14 NEW cov: 12368 ft: 14898 corp: 13/28b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:58.590 [2024-12-17 01:23:44.522017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.522043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.522099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.522113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.522169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.522183] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.522243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.522256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.590 [2024-12-17 01:23:44.522310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.590 [2024-12-17 01:23:44.522324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.590 #15 NEW cov: 12368 ft: 14937 corp: 14/33b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:58.591 [2024-12-17 01:23:44.561468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.591 [2024-12-17 01:23:44.561493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.591 #16 NEW cov: 12368 ft: 15024 corp: 15/34b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:58.849 [2024-12-17 01:23:44.602245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.849 [2024-12-17 01:23:44.602271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.849 [2024-12-17 01:23:44.602326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.849 [2024-12-17 01:23:44.602341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.849 [2024-12-17 01:23:44.602394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.849 [2024-12-17 01:23:44.602408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.849 [2024-12-17 01:23:44.602460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.849 [2024-12-17 01:23:44.602473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.849 [2024-12-17 01:23:44.602523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.849 [2024-12-17 01:23:44.602535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.849 #17 NEW cov: 12368 ft: 15063 corp: 16/39b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 ChangeByte- 00:07:58.849 [2024-12-17 01:23:44.661748] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.849 [2024-12-17 01:23:44.661773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.849 #18 NEW cov: 12368 ft: 15076 corp: 17/40b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 EraseBytes- 00:07:58.849 [2024-12-17 01:23:44.722090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.849 [2024-12-17 01:23:44.722116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.849 [2024-12-17 01:23:44.722172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.849 [2024-12-17 01:23:44.722188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.849 #19 NEW cov: 12368 ft: 15113 corp: 18/42b lim: 5 exec/s: 0 rss: 72Mb L: 2/5 MS: 1 InsertByte- 00:07:58.849 [2024-12-17 01:23:44.782214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.849 [2024-12-17 01:23:44.782239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.849 [2024-12-17 01:23:44.782295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.849 [2024-12-17 01:23:44.782308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.108 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:59.108 #20 NEW cov: 12391 ft: 15143 corp: 19/44b lim: 5 exec/s: 20 rss: 73Mb L: 2/5 MS: 1 ChangeByte- 00:07:59.108 [2024-12-17 01:23:45.073241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.108 [2024-12-17 01:23:45.073272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.108 [2024-12-17 01:23:45.073330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.108 [2024-12-17 01:23:45.073344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.108 [2024-12-17 01:23:45.073399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.108 [2024-12-17 01:23:45.073413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.108 #21 NEW cov: 12391 ft: 15178 corp: 20/47b lim: 5 exec/s: 21 rss: 73Mb L: 3/5 MS: 1 InsertByte- 00:07:59.366 [2024-12-17 
01:23:45.113622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.113649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.113707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.113721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.113780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.113798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.113855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.113868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.113928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.113946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.366 #22 NEW cov: 12391 ft: 15195 corp: 21/52b lim: 5 exec/s: 22 rss: 73Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:59.366 [2024-12-17 01:23:45.153285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.153312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.153368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.153382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.366 #23 NEW cov: 12391 ft: 15263 corp: 22/54b lim: 5 exec/s: 23 rss: 73Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:59.366 [2024-12-17 01:23:45.193383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.193410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.193468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.193481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.366 #24 NEW cov: 12391 ft: 15277 corp: 23/56b lim: 5 exec/s: 24 rss: 73Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:59.366 [2024-12-17 01:23:45.233831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.233857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.233914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.233928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.233984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.233998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.234052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.234065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.366 #25 NEW cov: 12391 ft: 15288 corp: 24/60b lim: 5 exec/s: 25 rss: 73Mb L: 4/5 MS: 1 EraseBytes- 00:07:59.366 [2024-12-17 01:23:45.293798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.293824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.293881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.293894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.293950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.293969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.366 #26 NEW cov: 12391 ft: 15303 corp: 25/63b lim: 5 exec/s: 26 rss: 73Mb L: 3/5 MS: 1 CopyPart- 00:07:59.366 [2024-12-17 01:23:45.333970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.333997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.334054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.334068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.366 [2024-12-17 01:23:45.334142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.366 [2024-12-17 01:23:45.334156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.624 #27 NEW cov: 12391 ft: 15307 corp: 26/66b lim: 5 exec/s: 27 rss: 73Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:59.624 [2024-12-17 01:23:45.394404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.394431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.624 [2024-12-17 01:23:45.394492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.394506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.624 [2024-12-17 01:23:45.394564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.394577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.624 [2024-12-17 01:23:45.394633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.394646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.624 [2024-12-17 01:23:45.394704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.394718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.624 #28 NEW cov: 12391 ft: 15329 corp: 27/71b lim: 5 exec/s: 28 rss: 74Mb L: 5/5 MS: 1 CrossOver- 00:07:59.624 [2024-12-17 01:23:45.454553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.454579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.624 [2024-12-17 01:23:45.454637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.454651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.624 [2024-12-17 01:23:45.454713] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.454727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.624 [2024-12-17 01:23:45.454782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.454800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.624 [2024-12-17 01:23:45.454855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.454869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.624 #29 NEW cov: 12391 ft: 15343 corp: 28/76b lim: 5 exec/s: 29 rss: 74Mb L: 5/5 MS: 1 CopyPart- 00:07:59.624 [2024-12-17 01:23:45.494257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.494285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.624 [2024-12-17 01:23:45.494347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.494361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.624 #30 NEW cov: 12391 ft: 15354 corp: 29/78b lim: 5 exec/s: 30 rss: 74Mb L: 2/5 MS: 1 CrossOver- 00:07:59.624 [2024-12-17 01:23:45.554283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.554310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.624 #31 NEW cov: 12391 ft: 15407 corp: 30/79b lim: 5 exec/s: 31 rss: 74Mb L: 1/5 MS: 1 CrossOver- 00:07:59.624 [2024-12-17 01:23:45.594536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.624 [2024-12-17 01:23:45.594565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.625 [2024-12-17 01:23:45.594624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.625 [2024-12-17 01:23:45.594638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.883 #32 NEW cov: 12391 ft: 15420 corp: 31/81b lim: 5 exec/s: 32 rss: 74Mb L: 2/5 MS: 1 CrossOver- 00:07:59.884 [2024-12-17 01:23:45.654818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 
cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.884 [2024-12-17 01:23:45.654844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.884 [2024-12-17 01:23:45.654900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.884 [2024-12-17 01:23:45.654913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.884 [2024-12-17 01:23:45.654968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.884 [2024-12-17 01:23:45.654986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.884 #33 NEW cov: 12391 ft: 15425 corp: 32/84b lim: 5 exec/s: 33 rss: 74Mb L: 3/5 MS: 1 CrossOver- 00:07:59.884 [2024-12-17 01:23:45.714844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.884 [2024-12-17 01:23:45.714869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.884 [2024-12-17 01:23:45.714926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.884 [2024-12-17 01:23:45.714941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.884 #34 NEW cov: 12391 ft: 15433 corp: 33/86b lim: 5 exec/s: 34 rss: 74Mb L: 2/5 MS: 1 ChangeBit- 00:07:59.884 [2024-12-17 01:23:45.755131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.884 [2024-12-17 01:23:45.755157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.884 [2024-12-17 01:23:45.755215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.884 [2024-12-17 01:23:45.755229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.884 [2024-12-17 01:23:45.755286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.884 [2024-12-17 01:23:45.755299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.884 #35 NEW cov: 12391 ft: 15456 corp: 34/89b lim: 5 exec/s: 35 rss: 74Mb L: 3/5 MS: 1 ChangeBit- 00:07:59.884 [2024-12-17 01:23:45.814973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.884 [2024-12-17 01:23:45.814999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:59.884 #36 NEW cov: 12391 ft: 15469 corp: 35/90b lim: 5 exec/s: 36 rss: 74Mb L: 1/5 MS: 1 CopyPart- 00:07:59.884 [2024-12-17 01:23:45.875127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.884 [2024-12-17 01:23:45.875152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.142 #37 NEW cov: 12391 ft: 15493 corp: 36/91b lim: 5 exec/s: 18 rss: 74Mb L: 1/5 MS: 1 ChangeByte- 00:08:00.142 #37 DONE cov: 12391 ft: 15493 corp: 36/91b lim: 5 exec/s: 18 rss: 74Mb 00:08:00.142 Done 37 runs in 2 second(s) 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.142 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:00.143 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:00.143 01:23:46 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:08:00.143 [2024-12-17 01:23:46.068465] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:00.143 [2024-12-17 01:23:46.068536] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid827189 ] 00:08:00.401 [2024-12-17 01:23:46.245464] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.401 [2024-12-17 01:23:46.266735] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.401 [2024-12-17 01:23:46.319115] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.401 [2024-12-17 01:23:46.335459] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:00.401 INFO: Running with entropic power schedule (0xFF, 100). 00:08:00.401 INFO: Seed: 1439278461 00:08:00.401 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:00.401 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:00.401 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:00.401 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.401 [2024-12-17 01:23:46.384365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.401 [2024-12-17 01:23:46.384394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.401 #2 INITED cov: 12164 ft: 12152 corp: 1/1b exec/s: 0 rss: 70Mb 00:08:00.659 [2024-12-17 01:23:46.424423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.659 [2024-12-17 01:23:46.424449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.659 #3 NEW cov: 12277 ft: 12728 corp: 2/2b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ChangeBinInt- 00:08:00.659 [2024-12-17 01:23:46.484598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.659 [2024-12-17 01:23:46.484624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.659 #4 NEW cov: 12283 ft: 13147 corp: 3/3b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ShuffleBytes- 00:08:00.659 [2024-12-17 01:23:46.524653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.659 [2024-12-17 01:23:46.524681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.659 #5 NEW cov: 12368 ft: 13432 corp: 4/4b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ShuffleBytes- 00:08:00.659 [2024-12-17 01:23:46.565482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.659 [2024-12-17 01:23:46.565508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.659 [2024-12-17 
01:23:46.565565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.659 [2024-12-17 01:23:46.565579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.659 [2024-12-17 01:23:46.565634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.659 [2024-12-17 01:23:46.565648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.659 [2024-12-17 01:23:46.565704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.659 [2024-12-17 01:23:46.565717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.659 [2024-12-17 01:23:46.565773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.660 [2024-12-17 01:23:46.565786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.660 #6 NEW cov: 12368 ft: 14366 corp: 5/9b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:00.660 [2024-12-17 01:23:46.624974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.660 [2024-12-17 01:23:46.625001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.919 #7 NEW cov: 12368 ft: 14463 corp: 6/10b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:00.919 [2024-12-17 01:23:46.685143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.685169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.919 #8 NEW cov: 12368 ft: 14516 corp: 7/11b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 CrossOver- 00:08:00.919 [2024-12-17 01:23:46.745804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.745831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.745887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.745901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.745956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 
[2024-12-17 01:23:46.745970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.746028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.746042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.919 #9 NEW cov: 12368 ft: 14551 corp: 8/15b lim: 5 exec/s: 0 rss: 71Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:00.919 [2024-12-17 01:23:46.785552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.785579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.785635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.785649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.919 #10 NEW cov: 12368 ft: 14769 corp: 9/17b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 CrossOver- 00:08:00.919 [2024-12-17 01:23:46.826146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.826172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.826227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.826240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.826296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.826311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.826366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.826379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.826432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.826445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.919 #11 NEW cov: 12368 ft: 14811 corp: 10/22b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:00.919 [2024-12-17 01:23:46.865802] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.865828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.865887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.865901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.919 #12 NEW cov: 12368 ft: 14827 corp: 11/24b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 CrossOver- 00:08:00.919 [2024-12-17 01:23:46.906295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.906324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.906381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.906394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.906450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.906463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.919 [2024-12-17 01:23:46.906518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.919 [2024-12-17 01:23:46.906531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.180 #13 NEW cov: 12368 ft: 14853 corp: 12/28b lim: 5 exec/s: 0 rss: 71Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:01.180 [2024-12-17 01:23:46.946539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.180 [2024-12-17 01:23:46.946565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:46.946618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:46.946632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:46.946688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:46.946701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:46.946757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:46.946770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:46.946821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:46.946835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.181 #14 NEW cov: 12368 ft: 14862 corp: 13/33b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CrossOver- 00:08:01.181 [2024-12-17 01:23:46.986017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:46.986044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.181 #15 NEW cov: 12368 ft: 14871 corp: 14/34b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 ChangeByte- 00:08:01.181 [2024-12-17 01:23:47.026780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.026810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.026867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.026885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.026940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.026954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.027013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.027026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.027080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.027094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.181 #16 NEW cov: 12368 ft: 14878 corp: 15/39b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeByte- 00:08:01.181 [2024-12-17 01:23:47.086941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.086969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.087030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.087044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.087102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.087115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.087171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.087184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.087254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.087268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.181 #17 NEW cov: 12368 ft: 14892 corp: 16/44b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:01.181 [2024-12-17 01:23:47.147122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.147146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.147205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.147219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.147274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.147291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.147346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.181 [2024-12-17 01:23:47.147360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.181 [2024-12-17 01:23:47.147416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:01.181 [2024-12-17 01:23:47.147430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.440 #18 NEW cov: 12368 ft: 14929 corp: 17/49b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CopyPart- 00:08:01.440 [2024-12-17 01:23:47.207157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.440 [2024-12-17 01:23:47.207182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.440 [2024-12-17 01:23:47.207240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.440 [2024-12-17 01:23:47.207254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.440 [2024-12-17 01:23:47.207309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.440 [2024-12-17 01:23:47.207323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.440 [2024-12-17 01:23:47.207379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.440 [2024-12-17 01:23:47.207392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.440 #19 NEW cov: 12368 ft: 14955 corp: 18/53b lim: 5 exec/s: 0 rss: 72Mb L: 4/5 MS: 1 CopyPart- 00:08:01.440 [2024-12-17 01:23:47.267315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.440 [2024-12-17 01:23:47.267341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.440 [2024-12-17 01:23:47.267398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.440 [2024-12-17 01:23:47.267412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.440 [2024-12-17 01:23:47.267468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.440 [2024-12-17 01:23:47.267482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.440 [2024-12-17 01:23:47.267538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.440 [2024-12-17 01:23:47.267552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.698 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:01.698 #20 NEW cov: 
12391 ft: 14985 corp: 19/57b lim: 5 exec/s: 20 rss: 73Mb L: 4/5 MS: 1 EraseBytes- 00:08:01.698 [2024-12-17 01:23:47.578306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.698 [2024-12-17 01:23:47.578337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.698 [2024-12-17 01:23:47.578395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.698 [2024-12-17 01:23:47.578409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.698 [2024-12-17 01:23:47.578466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.698 [2024-12-17 01:23:47.578479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.698 [2024-12-17 01:23:47.578535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.698 [2024-12-17 01:23:47.578549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.698 [2024-12-17 01:23:47.578603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.698 [2024-12-17 01:23:47.578617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.698 #21 NEW cov: 12391 ft: 15085 corp: 20/62b lim: 5 exec/s: 21 rss: 73Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:01.698 [2024-12-17 01:23:47.638363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.698 [2024-12-17 01:23:47.638388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.698 [2024-12-17 01:23:47.638446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.698 [2024-12-17 01:23:47.638460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.698 [2024-12-17 01:23:47.638519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.698 [2024-12-17 01:23:47.638533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.698 [2024-12-17 01:23:47.638589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.698 [2024-12-17 01:23:47.638603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.699 [2024-12-17 01:23:47.638660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.699 [2024-12-17 01:23:47.638674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.699 #22 NEW cov: 12391 ft: 15100 corp: 21/67b lim: 5 exec/s: 22 rss: 73Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:01.699 [2024-12-17 01:23:47.698084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.699 [2024-12-17 01:23:47.698109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.699 [2024-12-17 01:23:47.698174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.699 [2024-12-17 01:23:47.698189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.957 #23 NEW cov: 12391 ft: 15138 corp: 22/69b lim: 5 exec/s: 23 rss: 73Mb L: 2/5 MS: 1 CopyPart- 00:08:01.957 [2024-12-17 01:23:47.758681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.957 [2024-12-17 01:23:47.758706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.957 [2024-12-17 01:23:47.758765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.957 [2024-12-17 01:23:47.758779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.957 [2024-12-17 01:23:47.758855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.957 [2024-12-17 01:23:47.758881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.957 [2024-12-17 01:23:47.758940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.957 [2024-12-17 01:23:47.758953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.957 [2024-12-17 01:23:47.759011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.957 [2024-12-17 01:23:47.759024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.957 #24 NEW cov: 12391 ft: 15147 corp: 23/74b lim: 5 exec/s: 24 rss: 73Mb L: 5/5 MS: 1 ChangeByte- 00:08:01.957 [2024-12-17 01:23:47.798162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.957 [2024-12-17 01:23:47.798187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.957 #25 NEW cov: 12391 ft: 15184 corp: 24/75b lim: 5 exec/s: 25 rss: 73Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:01.957 [2024-12-17 01:23:47.838547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.957 [2024-12-17 01:23:47.838572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.957 [2024-12-17 01:23:47.838631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.958 [2024-12-17 01:23:47.838645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.958 [2024-12-17 01:23:47.838703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.958 [2024-12-17 01:23:47.838717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.958 #26 NEW cov: 12391 ft: 15375 corp: 25/78b lim: 5 exec/s: 26 rss: 73Mb L: 3/5 MS: 1 EraseBytes- 00:08:01.958 [2024-12-17 01:23:47.898953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.958 [2024-12-17 01:23:47.898981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.958 [2024-12-17 01:23:47.899039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.958 [2024-12-17 01:23:47.899054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.958 [2024-12-17 01:23:47.899109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.958 [2024-12-17 01:23:47.899123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.958 [2024-12-17 01:23:47.899181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.958 [2024-12-17 01:23:47.899194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.958 #27 NEW cov: 12391 ft: 15429 corp: 26/82b lim: 5 exec/s: 27 rss: 73Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:01.958 [2024-12-17 01:23:47.959340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.958 [2024-12-17 01:23:47.959367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:01.958 [2024-12-17 01:23:47.959427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.958 [2024-12-17 01:23:47.959441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.958 [2024-12-17 01:23:47.959498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.958 [2024-12-17 01:23:47.959512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.958 [2024-12-17 01:23:47.959571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.958 [2024-12-17 01:23:47.959585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.958 [2024-12-17 01:23:47.959642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.958 [2024-12-17 01:23:47.959656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.217 #28 NEW cov: 12391 ft: 15443 corp: 27/87b lim: 5 exec/s: 28 rss: 73Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:02.217 [2024-12-17 01:23:47.998721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:47.998746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.217 #29 NEW cov: 12391 ft: 15465 corp: 28/88b lim: 5 exec/s: 29 rss: 73Mb L: 1/5 MS: 1 ChangeBit- 00:08:02.217 [2024-12-17 01:23:48.039453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.039477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.039564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.039579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.039633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.039646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.039699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.039713] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.039768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.039781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.217 #30 NEW cov: 12391 ft: 15474 corp: 29/93b lim: 5 exec/s: 30 rss: 73Mb L: 5/5 MS: 1 InsertByte- 00:08:02.217 [2024-12-17 01:23:48.079308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.079333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.079388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.079402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.079460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.079474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.217 #31 NEW cov: 12391 ft: 15498 corp: 30/96b lim: 5 exec/s: 31 rss: 74Mb L: 3/5 MS: 1 EraseBytes- 00:08:02.217 [2024-12-17 01:23:48.119598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.119623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.119696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.119710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.119768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.119782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.119840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.119853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.217 #32 NEW cov: 12391 ft: 15502 corp: 31/100b lim: 5 exec/s: 32 rss: 74Mb L: 4/5 MS: 1 EraseBytes- 00:08:02.217 [2024-12-17 01:23:48.159130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.159155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.217 #33 NEW cov: 12391 ft: 15553 corp: 32/101b lim: 5 exec/s: 33 rss: 74Mb L: 1/5 MS: 1 CopyPart- 00:08:02.217 [2024-12-17 01:23:48.199952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.199978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.200034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.200048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.200105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.200119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.200176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.200189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.217 [2024-12-17 01:23:48.200245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.217 [2024-12-17 01:23:48.200259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.478 #34 NEW cov: 12391 ft: 15582 corp: 33/106b lim: 5 exec/s: 34 rss: 74Mb L: 5/5 MS: 1 ChangeBit- 00:08:02.478 [2024-12-17 01:23:48.259519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.478 [2024-12-17 01:23:48.259544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.478 #35 NEW cov: 12391 ft: 15591 corp: 34/107b lim: 5 exec/s: 35 rss: 74Mb L: 1/5 MS: 1 CrossOver- 00:08:02.478 [2024-12-17 01:23:48.300059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.478 [2024-12-17 01:23:48.300084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.478 [2024-12-17 01:23:48.300141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.478 [2024-12-17 01:23:48.300155] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.478 [2024-12-17 01:23:48.300213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.478 [2024-12-17 01:23:48.300226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.478 [2024-12-17 01:23:48.300283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.478 [2024-12-17 01:23:48.300301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.478 #36 NEW cov: 12391 ft: 15611 corp: 35/111b lim: 5 exec/s: 36 rss: 74Mb L: 4/5 MS: 1 ShuffleBytes- 00:08:02.478 [2024-12-17 01:23:48.359755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.478 [2024-12-17 01:23:48.359781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.478 [2024-12-17 01:23:48.399857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.478 [2024-12-17 01:23:48.399882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.478 #38 NEW cov: 12391 ft: 15617 corp: 36/112b lim: 5 exec/s: 19 rss: 74Mb L: 1/5 MS: 2 ShuffleBytes-ChangeByte- 00:08:02.478 #38 DONE cov: 12391 ft: 15617 corp: 36/112b lim: 5 exec/s: 19 rss: 74Mb 00:08:02.478 Done 38 runs in 2 second(s) 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz 
-- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:02.738 01:23:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:08:02.738 [2024-12-17 01:23:48.575966] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:02.738 [2024-12-17 01:23:48.576061] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid827637 ] 00:08:02.996 [2024-12-17 01:23:48.759854] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.996 [2024-12-17 01:23:48.781408] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.996 [2024-12-17 01:23:48.833683] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:02.996 [2024-12-17 01:23:48.850017] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:02.996 INFO: Running with entropic power schedule (0xFF, 100). 00:08:02.996 INFO: Seed: 3955295225 00:08:02.996 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:02.996 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:02.996 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:02.996 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.996 #2 INITED exec/s: 0 rss: 64Mb 00:08:02.996 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:02.997 This may also happen if the target rejected all inputs we tried so far 00:08:02.997 [2024-12-17 01:23:48.894853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.997 [2024-12-17 01:23:48.894888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.997 [2024-12-17 01:23:48.894924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.997 [2024-12-17 01:23:48.894940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.255 NEW_FUNC[1/713]: 0x45f648 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:03.255 NEW_FUNC[2/713]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.255 #8 NEW cov: 12170 ft: 12185 corp: 2/17b lim: 40 exec/s: 0 rss: 72Mb L: 16/16 MS: 1 InsertRepeatedBytes- 00:08:03.255 [2024-12-17 01:23:49.245744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.255 [2024-12-17 01:23:49.245782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.255 [2024-12-17 01:23:49.245825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.255 [2024-12-17 01:23:49.245850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.513 NEW_FUNC[1/1]: 0x1f5b618 in thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1067 00:08:03.513 #12 NEW cov: 12300 ft: 12824 corp: 3/39b lim: 40 exec/s: 0 rss: 72Mb L: 22/22 MS: 4 ShuffleBytes-ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:03.513 [2024-12-17 01:23:49.305736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.513 [2024-12-17 01:23:49.305767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.513 [2024-12-17 01:23:49.305824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.513 [2024-12-17 01:23:49.305840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.513 #13 NEW cov: 12306 ft: 13102 corp: 4/60b lim: 40 exec/s: 0 rss: 72Mb L: 21/22 MS: 1 EraseBytes- 00:08:03.513 [2024-12-17 01:23:49.396082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.513 [2024-12-17 01:23:49.396113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:03.513 [2024-12-17 01:23:49.396148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.513 [2024-12-17 01:23:49.396167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.513 [2024-12-17 01:23:49.396198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aeaeaeae cdw11:aeaeffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.513 [2024-12-17 01:23:49.396214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.513 #19 NEW cov: 12391 ft: 13567 corp: 5/90b lim: 40 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:03.513 [2024-12-17 01:23:49.456149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.513 [2024-12-17 01:23:49.456180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.513 [2024-12-17 01:23:49.456214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.513 [2024-12-17 01:23:49.456230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.772 #20 NEW cov: 12391 ft: 13660 corp: 6/113b lim: 40 exec/s: 0 rss: 72Mb L: 23/30 MS: 1 CopyPart- 00:08:03.772 [2024-12-17 01:23:49.546368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.546397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.772 [2024-12-17 01:23:49.546446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.546467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.772 #21 NEW cov: 12391 ft: 13736 corp: 7/135b lim: 40 exec/s: 0 rss: 72Mb L: 22/30 MS: 1 ShuffleBytes- 00:08:03.772 [2024-12-17 01:23:49.596627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.596656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.772 [2024-12-17 01:23:49.596691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff8a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.596706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.772 [2024-12-17 01:23:49.596737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:3fffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.596753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.772 [2024-12-17 01:23:49.596783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.596806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.772 #22 NEW cov: 12391 ft: 14300 corp: 8/174b lim: 40 exec/s: 0 rss: 72Mb L: 39/39 MS: 1 CrossOver- 00:08:03.772 [2024-12-17 01:23:49.656766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.656802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.772 [2024-12-17 01:23:49.656842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.656858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.772 [2024-12-17 01:23:49.656889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.656904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.772 [2024-12-17 01:23:49.656934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.656949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.772 #24 NEW cov: 12391 ft: 14342 corp: 9/212b lim: 40 exec/s: 0 rss: 72Mb L: 38/39 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:03.772 [2024-12-17 01:23:49.716893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.716932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.772 [2024-12-17 01:23:49.716981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.717001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.772 [2024-12-17 01:23:49.717032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.772 [2024-12-17 01:23:49.717048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.030 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 
00:08:04.030 #25 NEW cov: 12414 ft: 14389 corp: 10/238b lim: 40 exec/s: 0 rss: 72Mb L: 26/39 MS: 1 CrossOver- 00:08:04.030 [2024-12-17 01:23:49.807119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.030 [2024-12-17 01:23:49.807148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.030 [2024-12-17 01:23:49.807197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.030 [2024-12-17 01:23:49.807221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.030 [2024-12-17 01:23:49.807251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff7aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.030 [2024-12-17 01:23:49.807267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.030 [2024-12-17 01:23:49.807296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.030 [2024-12-17 01:23:49.807312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.030 #26 NEW cov: 12414 ft: 14437 corp: 11/277b lim: 40 exec/s: 26 rss: 72Mb L: 39/39 MS: 1 InsertByte- 00:08:04.030 [2024-12-17 01:23:49.897247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.030 [2024-12-17 01:23:49.897280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.030 [2024-12-17 01:23:49.897330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.030 [2024-12-17 01:23:49.897353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.030 #27 NEW cov: 12414 ft: 14495 corp: 12/294b lim: 40 exec/s: 27 rss: 72Mb L: 17/39 MS: 1 CrossOver- 00:08:04.030 [2024-12-17 01:23:49.957508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.030 [2024-12-17 01:23:49.957537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.030 [2024-12-17 01:23:49.957586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.030 [2024-12-17 01:23:49.957610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.030 [2024-12-17 01:23:49.957640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff7aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:04.030 [2024-12-17 01:23:49.957656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.030 [2024-12-17 01:23:49.957686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.030 [2024-12-17 01:23:49.957701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.289 #28 NEW cov: 12414 ft: 14600 corp: 13/333b lim: 40 exec/s: 28 rss: 73Mb L: 39/39 MS: 1 CopyPart- 00:08:04.289 [2024-12-17 01:23:50.048463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.289 [2024-12-17 01:23:50.048498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.289 [2024-12-17 01:23:50.048537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffbf8a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.289 [2024-12-17 01:23:50.048555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.289 [2024-12-17 01:23:50.048590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:3fffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.289 [2024-12-17 01:23:50.048607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.289 [2024-12-17 01:23:50.048641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.289 [2024-12-17 01:23:50.048658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.289 #29 NEW cov: 12414 ft: 14616 corp: 14/372b lim: 40 exec/s: 29 rss: 73Mb L: 39/39 MS: 1 ChangeBit- 00:08:04.289 [2024-12-17 01:23:50.148032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.289 [2024-12-17 01:23:50.148067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.289 [2024-12-17 01:23:50.148109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:8a3fff8a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.289 [2024-12-17 01:23:50.148125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.289 #30 NEW cov: 12414 ft: 14730 corp: 15/389b lim: 40 exec/s: 30 rss: 73Mb L: 17/39 MS: 1 EraseBytes- 00:08:04.289 [2024-12-17 01:23:50.238320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.289 [2024-12-17 01:23:50.238351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.289 [2024-12-17 01:23:50.238402] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffebebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.289 [2024-12-17 01:23:50.238421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.289 [2024-12-17 01:23:50.238452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ebebebeb cdw11:ebebffae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.289 [2024-12-17 01:23:50.238469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.289 [2024-12-17 01:23:50.238500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.289 [2024-12-17 01:23:50.238515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.289 #31 NEW cov: 12414 ft: 14734 corp: 16/428b lim: 40 exec/s: 31 rss: 73Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:04.548 [2024-12-17 01:23:50.298362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dfffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.298391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.548 [2024-12-17 01:23:50.298442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.298464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.548 #32 NEW cov: 12414 ft: 14771 corp: 17/449b lim: 40 exec/s: 32 rss: 73Mb L: 21/39 MS: 1 ChangeBit- 00:08:04.548 [2024-12-17 01:23:50.348500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.348530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.548 [2024-12-17 01:23:50.348564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.348579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.548 #33 NEW cov: 12414 ft: 14806 corp: 18/472b lim: 40 exec/s: 33 rss: 73Mb L: 23/39 MS: 1 EraseBytes- 00:08:04.548 [2024-12-17 01:23:50.408710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.408739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.548 [2024-12-17 01:23:50.408790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff7affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 
01:23:50.408816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.548 [2024-12-17 01:23:50.408849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.408864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.548 #34 NEW cov: 12414 ft: 14815 corp: 19/502b lim: 40 exec/s: 34 rss: 73Mb L: 30/39 MS: 1 EraseBytes- 00:08:04.548 [2024-12-17 01:23:50.458908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.458937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.548 [2024-12-17 01:23:50.458971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffff65 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.458987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.548 [2024-12-17 01:23:50.459021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.459036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.548 [2024-12-17 01:23:50.459066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.459081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.548 #35 NEW cov: 12414 ft: 14833 corp: 20/540b lim: 40 exec/s: 35 rss: 73Mb L: 38/39 MS: 1 ChangeByte- 00:08:04.548 [2024-12-17 01:23:50.508922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.508951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.548 [2024-12-17 01:23:50.509002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.548 [2024-12-17 01:23:50.509022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.548 #36 NEW cov: 12414 ft: 14853 corp: 21/563b lim: 40 exec/s: 36 rss: 73Mb L: 23/39 MS: 1 CopyPart- 00:08:04.807 [2024-12-17 01:23:50.569073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffbfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.569101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.807 [2024-12-17 01:23:50.569152] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:8a3fff8a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.569171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.807 #37 NEW cov: 12414 ft: 14871 corp: 22/580b lim: 40 exec/s: 37 rss: 73Mb L: 17/39 MS: 1 ChangeBit- 00:08:04.807 [2024-12-17 01:23:50.659446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.659477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.807 [2024-12-17 01:23:50.659512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffebebeb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.659532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.807 [2024-12-17 01:23:50.659564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ebebebeb cdw11:ebebffae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.659580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.807 [2024-12-17 01:23:50.659610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.659625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.807 #38 NEW cov: 12414 ft: 14874 corp: 23/619b lim: 40 exec/s: 38 rss: 73Mb L: 39/39 MS: 1 ChangeBinInt- 00:08:04.807 [2024-12-17 01:23:50.749647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.749678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.807 [2024-12-17 01:23:50.749713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffaeae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.749729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.807 [2024-12-17 01:23:50.749761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aeaeaeae cdw11:aeaeffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.749776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.807 #39 NEW cov: 12414 ft: 14980 corp: 24/649b lim: 40 exec/s: 39 rss: 73Mb L: 30/39 MS: 1 ShuffleBytes- 00:08:04.807 [2024-12-17 01:23:50.799766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.799802] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.807 [2024-12-17 01:23:50.799837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff7affff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.799869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.807 [2024-12-17 01:23:50.799901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff54ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.807 [2024-12-17 01:23:50.799917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.066 #40 NEW cov: 12414 ft: 14986 corp: 25/679b lim: 40 exec/s: 40 rss: 73Mb L: 30/39 MS: 1 ChangeByte- 00:08:05.066 [2024-12-17 01:23:50.890026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.066 [2024-12-17 01:23:50.890055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.066 [2024-12-17 01:23:50.890089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffff65 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.066 [2024-12-17 01:23:50.890105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.066 [2024-12-17 01:23:50.890140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffa8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.066 [2024-12-17 01:23:50.890155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.066 [2024-12-17 01:23:50.890185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.066 [2024-12-17 01:23:50.890200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.066 #41 NEW cov: 12414 ft: 15007 corp: 26/717b lim: 40 exec/s: 20 rss: 73Mb L: 38/39 MS: 1 ChangeByte- 00:08:05.066 #41 DONE cov: 12414 ft: 15007 corp: 26/717b lim: 40 exec/s: 20 rss: 73Mb 00:08:05.066 Done 41 runs in 2 second(s) 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 
00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:05.324 01:23:51 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:08:05.324 [2024-12-17 01:23:51.119519] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:05.324 [2024-12-17 01:23:51.119591] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid828010 ] 00:08:05.583 [2024-12-17 01:23:51.383160] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.583 [2024-12-17 01:23:51.412471] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.583 [2024-12-17 01:23:51.464920] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.583 [2024-12-17 01:23:51.481226] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:05.583 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.583 INFO: Seed: 2291292281 00:08:05.583 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:05.583 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:05.583 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:05.583 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.583 #2 INITED exec/s: 0 rss: 64Mb 00:08:05.583 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:05.583 This may also happen if the target rejected all inputs we tried so far 00:08:05.583 [2024-12-17 01:23:51.526717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.583 [2024-12-17 01:23:51.526745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.841 NEW_FUNC[1/715]: 0x4613b8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:05.841 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.841 #3 NEW cov: 12199 ft: 12188 corp: 2/9b lim: 40 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:08:06.099 [2024-12-17 01:23:51.857564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.099 [2024-12-17 01:23:51.857596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.099 #4 NEW cov: 12312 ft: 12732 corp: 3/23b lim: 40 exec/s: 0 rss: 72Mb L: 14/14 MS: 1 InsertRepeatedBytes- 00:08:06.099 [2024-12-17 01:23:51.897732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.099 [2024-12-17 01:23:51.897758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.099 [2024-12-17 01:23:51.897826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.099 [2024-12-17 01:23:51.897840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.099 #5 NEW cov: 12318 ft: 13691 corp: 4/39b lim: 40 exec/s: 0 rss: 72Mb L: 16/16 MS: 1 CrossOver- 00:08:06.099 [2024-12-17 01:23:51.957769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.099 [2024-12-17 01:23:51.957800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.099 #6 NEW cov: 12403 ft: 14006 corp: 5/54b lim: 40 exec/s: 0 rss: 72Mb L: 15/16 MS: 1 InsertByte- 00:08:06.099 [2024-12-17 01:23:52.017942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.099 [2024-12-17 01:23:52.017966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.099 #7 NEW cov: 12403 ft: 14116 corp: 6/69b lim: 40 exec/s: 0 rss: 72Mb L: 15/16 MS: 1 ChangeBinInt- 00:08:06.099 [2024-12-17 01:23:52.078613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.099 [2024-12-17 01:23:52.078638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:06.099 [2024-12-17 01:23:52.078703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.099 [2024-12-17 01:23:52.078717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.099 [2024-12-17 01:23:52.078781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.099 [2024-12-17 01:23:52.078799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.099 [2024-12-17 01:23:52.078862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.099 [2024-12-17 01:23:52.078875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.358 #8 NEW cov: 12403 ft: 14499 corp: 7/104b lim: 40 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:06.358 [2024-12-17 01:23:52.118160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.358 [2024-12-17 01:23:52.118186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.358 #9 NEW cov: 12403 ft: 14711 corp: 8/118b lim: 40 exec/s: 0 rss: 72Mb L: 14/35 MS: 1 CopyPart- 00:08:06.358 [2024-12-17 01:23:52.158301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.358 [2024-12-17 01:23:52.158325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.358 #10 NEW cov: 12403 ft: 14828 corp: 9/133b lim: 40 exec/s: 0 rss: 72Mb L: 15/35 MS: 1 InsertByte- 00:08:06.358 [2024-12-17 01:23:52.218524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.358 [2024-12-17 01:23:52.218549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.358 #11 NEW cov: 12403 ft: 14876 corp: 10/141b lim: 40 exec/s: 0 rss: 72Mb L: 8/35 MS: 1 CrossOver- 00:08:06.358 [2024-12-17 01:23:52.258608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.358 [2024-12-17 01:23:52.258632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.358 #12 NEW cov: 12403 ft: 14946 corp: 11/155b lim: 40 exec/s: 0 rss: 72Mb L: 14/35 MS: 1 CrossOver- 00:08:06.358 [2024-12-17 01:23:52.298711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.358 [2024-12-17 01:23:52.298736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:06.358 #13 NEW cov: 12403 ft: 14961 corp: 12/170b lim: 40 exec/s: 0 rss: 72Mb L: 15/35 MS: 1 ChangeBit- 00:08:06.358 [2024-12-17 01:23:52.338842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.358 [2024-12-17 01:23:52.338867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.358 #16 NEW cov: 12403 ft: 15005 corp: 13/184b lim: 40 exec/s: 0 rss: 72Mb L: 14/35 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:08:06.616 [2024-12-17 01:23:52.378959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.616 [2024-12-17 01:23:52.378984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.616 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:06.616 #17 NEW cov: 12426 ft: 15018 corp: 14/198b lim: 40 exec/s: 0 rss: 73Mb L: 14/35 MS: 1 ChangeBit- 00:08:06.616 [2024-12-17 01:23:52.439135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.616 [2024-12-17 01:23:52.439161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.616 #18 NEW cov: 12426 ft: 15086 corp: 15/213b lim: 40 exec/s: 0 rss: 73Mb L: 15/35 MS: 1 ShuffleBytes- 00:08:06.616 [2024-12-17 01:23:52.479265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.616 [2024-12-17 01:23:52.479289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.616 #19 NEW cov: 12426 ft: 15113 corp: 16/227b lim: 40 exec/s: 0 rss: 73Mb L: 14/35 MS: 1 ShuffleBytes- 00:08:06.616 [2024-12-17 01:23:52.519370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.616 [2024-12-17 01:23:52.519394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.616 #20 NEW cov: 12426 ft: 15130 corp: 17/241b lim: 40 exec/s: 20 rss: 73Mb L: 14/35 MS: 1 ChangeByte- 00:08:06.616 [2024-12-17 01:23:52.579567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0a2c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.616 [2024-12-17 01:23:52.579591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.616 #22 NEW cov: 12426 ft: 15173 corp: 18/249b lim: 40 exec/s: 22 rss: 73Mb L: 8/35 MS: 2 EraseBytes-InsertByte- 00:08:06.616 [2024-12-17 01:23:52.619679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffefffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.617 [2024-12-17 01:23:52.619704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:06.875 #23 NEW cov: 12426 ft: 15216 corp: 19/257b lim: 40 exec/s: 23 rss: 73Mb L: 8/35 MS: 1 ChangeBit- 00:08:06.875 [2024-12-17 01:23:52.659987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.875 [2024-12-17 01:23:52.660011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.875 [2024-12-17 01:23:52.660072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff2e250a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.875 [2024-12-17 01:23:52.660085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.875 #24 NEW cov: 12426 ft: 15233 corp: 20/273b lim: 40 exec/s: 24 rss: 73Mb L: 16/35 MS: 1 InsertByte- 00:08:06.875 [2024-12-17 01:23:52.719989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.875 [2024-12-17 01:23:52.720013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.875 #25 NEW cov: 12426 ft: 15248 corp: 21/282b lim: 40 exec/s: 25 rss: 73Mb L: 9/35 MS: 1 EraseBytes- 00:08:06.875 [2024-12-17 01:23:52.760611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.875 [2024-12-17 01:23:52.760636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.875 [2024-12-17 01:23:52.760700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.875 [2024-12-17 01:23:52.760714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.875 [2024-12-17 01:23:52.760782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.875 [2024-12-17 01:23:52.760799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.875 [2024-12-17 01:23:52.760862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:22222200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.875 [2024-12-17 01:23:52.760876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.875 #26 NEW cov: 12426 ft: 15274 corp: 22/320b lim: 40 exec/s: 26 rss: 73Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:06.875 [2024-12-17 01:23:52.820430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.875 [2024-12-17 01:23:52.820456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.875 [2024-12-17 01:23:52.820521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 
nsid:0 cdw10:ffffff31 cdw11:ffff2e25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.875 [2024-12-17 01:23:52.820535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.875 #27 NEW cov: 12426 ft: 15299 corp: 23/337b lim: 40 exec/s: 27 rss: 73Mb L: 17/38 MS: 1 InsertByte- 00:08:07.133 [2024-12-17 01:23:52.880405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.133 [2024-12-17 01:23:52.880431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.133 #28 NEW cov: 12426 ft: 15331 corp: 24/351b lim: 40 exec/s: 28 rss: 73Mb L: 14/38 MS: 1 ChangeBit- 00:08:07.133 [2024-12-17 01:23:52.940889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.133 [2024-12-17 01:23:52.940913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.133 [2024-12-17 01:23:52.940976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff2e25ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.133 [2024-12-17 01:23:52.940990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.133 [2024-12-17 01:23:52.941052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.133 [2024-12-17 01:23:52.941065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.133 #29 NEW cov: 12426 ft: 15530 corp: 25/381b lim: 40 exec/s: 29 rss: 73Mb L: 30/38 MS: 1 InsertRepeatedBytes- 00:08:07.133 [2024-12-17 01:23:52.981226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.134 [2024-12-17 01:23:52.981251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.134 [2024-12-17 01:23:52.981313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.134 [2024-12-17 01:23:52.981327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.134 [2024-12-17 01:23:52.981389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.134 [2024-12-17 01:23:52.981406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.134 [2024-12-17 01:23:52.981467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.134 [2024-12-17 01:23:52.981481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:08:07.134 #30 NEW cov: 12426 ft: 15550 corp: 26/420b lim: 40 exec/s: 30 rss: 73Mb L: 39/39 MS: 1 CrossOver- 00:08:07.134 [2024-12-17 01:23:53.020731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff8a8a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.134 [2024-12-17 01:23:53.020756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.134 #31 NEW cov: 12426 ft: 15595 corp: 27/435b lim: 40 exec/s: 31 rss: 73Mb L: 15/39 MS: 1 InsertRepeatedBytes- 00:08:07.134 [2024-12-17 01:23:53.080929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffeffffd cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.134 [2024-12-17 01:23:53.080955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.134 #32 NEW cov: 12426 ft: 15602 corp: 28/443b lim: 40 exec/s: 32 rss: 73Mb L: 8/39 MS: 1 ChangeBit- 00:08:07.392 [2024-12-17 01:23:53.141647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff3e3e3e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.392 [2024-12-17 01:23:53.141673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.392 [2024-12-17 01:23:53.141737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3e3e3e3e cdw11:3e3e3e3e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.392 [2024-12-17 01:23:53.141752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.392 [2024-12-17 01:23:53.141809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3e3e3e3e cdw11:3e3e3e3e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.392 [2024-12-17 01:23:53.141824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.392 [2024-12-17 01:23:53.141886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:3e3e3e3e cdw11:3e3effff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.392 [2024-12-17 01:23:53.141900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.392 #33 NEW cov: 12426 ft: 15625 corp: 29/482b lim: 40 exec/s: 33 rss: 73Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:07.392 [2024-12-17 01:23:53.201635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.392 [2024-12-17 01:23:53.201660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.392 [2024-12-17 01:23:53.201739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.392 [2024-12-17 01:23:53.201754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.392 [2024-12-17 01:23:53.201821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) 
qid:0 cid:6 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.392 [2024-12-17 01:23:53.201835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.392 #34 NEW cov: 12426 ft: 15644 corp: 30/511b lim: 40 exec/s: 34 rss: 74Mb L: 29/39 MS: 1 CrossOver- 00:08:07.392 [2024-12-17 01:23:53.261502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.392 [2024-12-17 01:23:53.261526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.392 #35 NEW cov: 12426 ft: 15711 corp: 31/526b lim: 40 exec/s: 35 rss: 74Mb L: 15/39 MS: 1 ShuffleBytes- 00:08:07.392 [2024-12-17 01:23:53.301549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.392 [2024-12-17 01:23:53.301574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.392 #36 NEW cov: 12426 ft: 15733 corp: 32/536b lim: 40 exec/s: 36 rss: 74Mb L: 10/39 MS: 1 CopyPart- 00:08:07.392 [2024-12-17 01:23:53.361723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.392 [2024-12-17 01:23:53.361747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.392 #37 NEW cov: 12426 ft: 15737 corp: 33/550b lim: 40 exec/s: 37 rss: 74Mb L: 14/39 MS: 1 ChangeBinInt- 00:08:07.651 [2024-12-17 01:23:53.402136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.651 [2024-12-17 01:23:53.402161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.651 [2024-12-17 01:23:53.402225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0a7f7f7f cdw11:7f7f7f7f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.651 [2024-12-17 01:23:53.402240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.651 [2024-12-17 01:23:53.402302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.651 [2024-12-17 01:23:53.402316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.651 #38 NEW cov: 12426 ft: 15796 corp: 34/580b lim: 40 exec/s: 38 rss: 74Mb L: 30/39 MS: 1 InsertRepeatedBytes- 00:08:07.651 [2024-12-17 01:23:53.442248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.651 [2024-12-17 01:23:53.442272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.651 [2024-12-17 01:23:53.442333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 
nsid:0 cdw10:0a7fec7f cdw11:7f7f7f7f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.651 [2024-12-17 01:23:53.442347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.651 [2024-12-17 01:23:53.442409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.651 [2024-12-17 01:23:53.442423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.651 #39 NEW cov: 12426 ft: 15810 corp: 35/611b lim: 40 exec/s: 39 rss: 74Mb L: 31/39 MS: 1 InsertByte- 00:08:07.651 [2024-12-17 01:23:53.502640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff3e3e3e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.651 [2024-12-17 01:23:53.502664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.651 [2024-12-17 01:23:53.502730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3e3e3e3e cdw11:3e3e3e3e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.651 [2024-12-17 01:23:53.502748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.651 [2024-12-17 01:23:53.502811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3e3e3e3e cdw11:3e3e3e3e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.651 [2024-12-17 01:23:53.502826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.651 [2024-12-17 01:23:53.502889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:3e3e3e3e cdw11:3e3effff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.651 [2024-12-17 01:23:53.502903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.651 #40 NEW cov: 12426 ft: 15828 corp: 36/650b lim: 40 exec/s: 20 rss: 74Mb L: 39/39 MS: 1 ChangeASCIIInt- 00:08:07.651 #40 DONE cov: 12426 ft: 15828 corp: 36/650b lim: 40 exec/s: 20 rss: 74Mb 00:08:07.651 Done 40 runs in 2 second(s) 00:08:07.910 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:08:07.910 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:07.910 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.910 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:07.910 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:07.910 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:07.910 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.910 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:07.911 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:07.911 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:07.911 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- 
nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:07.911 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:08:07.911 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:08:07.911 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:07.911 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:07.911 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.911 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:07.911 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:07.911 01:23:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:08:07.911 [2024-12-17 01:23:53.706222] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:07.911 [2024-12-17 01:23:53.706295] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid828544 ] 00:08:08.169 [2024-12-17 01:23:53.963070] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.169 [2024-12-17 01:23:53.993764] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.169 [2024-12-17 01:23:54.046054] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.169 [2024-12-17 01:23:54.062398] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:08.169 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.169 INFO: Seed: 577352293 00:08:08.169 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:08.169 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:08.169 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:08.169 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.169 #2 INITED exec/s: 0 rss: 64Mb 00:08:08.169 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:08.169 This may also happen if the target rejected all inputs we tried so far 00:08:08.169 [2024-12-17 01:23:54.108546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.169 [2024-12-17 01:23:54.108576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.169 [2024-12-17 01:23:54.108651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.169 [2024-12-17 01:23:54.108666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.169 [2024-12-17 01:23:54.108725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.169 [2024-12-17 01:23:54.108739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.169 [2024-12-17 01:23:54.108803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.169 [2024-12-17 01:23:54.108817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.169 [2024-12-17 01:23:54.108874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.169 [2024-12-17 01:23:54.108888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.431 NEW_FUNC[1/715]: 0x463128 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:08.431 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.431 #6 NEW cov: 12192 ft: 12175 corp: 2/41b lim: 40 exec/s: 0 rss: 71Mb L: 40/40 MS: 4 CMP-CMP-ChangeBinInt-InsertRepeatedBytes- DE: "\001\000\000s"-"\023\001"- 00:08:08.714 [2024-12-17 01:23:54.439299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.714 [2024-12-17 01:23:54.439333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.714 [2024-12-17 01:23:54.439390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.714 [2024-12-17 01:23:54.439404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.714 [2024-12-17 01:23:54.439458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.714 [2024-12-17 01:23:54.439471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:08:08.714 [2024-12-17 01:23:54.439531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.714 [2024-12-17 01:23:54.439544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.714 [2024-12-17 01:23:54.439600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.714 [2024-12-17 01:23:54.439613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.714 #7 NEW cov: 12310 ft: 12710 corp: 3/81b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 CopyPart- 00:08:08.714 [2024-12-17 01:23:54.499299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.714 [2024-12-17 01:23:54.499327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.714 [2024-12-17 01:23:54.499382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.714 [2024-12-17 01:23:54.499396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.714 [2024-12-17 01:23:54.499451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:d4949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.714 [2024-12-17 01:23:54.499464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.714 [2024-12-17 01:23:54.499516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.714 [2024-12-17 01:23:54.499529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.499584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.499598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.715 #8 NEW cov: 12316 ft: 13006 corp: 4/121b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeBit- 00:08:08.715 [2024-12-17 01:23:54.539423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.539452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.539507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.539520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.539575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.539588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.539644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.539656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.539709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.539725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.715 #9 NEW cov: 12401 ft: 13259 corp: 5/161b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeByte- 00:08:08.715 [2024-12-17 01:23:54.599587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.599612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.599666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.599679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.599735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.599748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.599807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.599820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.599872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.599885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.715 #10 NEW cov: 12401 ft: 13299 corp: 6/201b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:08.715 [2024-12-17 01:23:54.659750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.659776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.659832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.659846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.659898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.659911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.659966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.659978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.715 [2024-12-17 01:23:54.660032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.715 [2024-12-17 01:23:54.660044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.715 #11 NEW cov: 12401 ft: 13348 corp: 7/241b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:09.091 [2024-12-17 01:23:54.719888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.719919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.719978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.719992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.720044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.720058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.720109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94948494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.720123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.720176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.720190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.091 #12 NEW cov: 12401 ft: 13457 corp: 
8/281b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeBit- 00:08:09.091 [2024-12-17 01:23:54.780047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.780073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.780128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1f000000 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.780142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.780197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.780210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.780263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.780275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.780330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.780343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.091 #13 NEW cov: 12401 ft: 13611 corp: 9/321b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 CMP- DE: "\037\000\000\000"- 00:08:09.091 [2024-12-17 01:23:54.820129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94959494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.820154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.820226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.820244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.820300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.820313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.820369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.820382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.091 
[2024-12-17 01:23:54.820446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.820459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.091 #14 NEW cov: 12401 ft: 13687 corp: 10/361b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeBit- 00:08:09.091 [2024-12-17 01:23:54.860270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.860295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.860352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94d49494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.860365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.860419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:d4949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.860432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.860489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.860502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.091 [2024-12-17 01:23:54.860557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.091 [2024-12-17 01:23:54.860569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.091 #15 NEW cov: 12401 ft: 13732 corp: 11/401b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeBit- 00:08:09.091 [2024-12-17 01:23:54.900383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94130194 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.900408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:54.900464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.900477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:54.900531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.900543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:09.092 [2024-12-17 01:23:54.900600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.900613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:54.900668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.900680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.092 #16 NEW cov: 12401 ft: 13744 corp: 12/441b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 PersAutoDict- DE: "\023\001"- 00:08:09.092 [2024-12-17 01:23:54.940454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.940478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:54.940532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000010 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.940545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:54.940601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.940614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:54.940670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.940683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:54.940738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.940751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.092 #17 NEW cov: 12401 ft: 13755 corp: 13/481b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\020"- 00:08:09.092 [2024-12-17 01:23:54.980585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:948a9494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.980610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:54.980683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.980697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:54.980753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.980766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:54.980820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.980833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:54.980893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:54.980905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.092 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:09.092 #18 NEW cov: 12424 ft: 13796 corp: 14/521b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:09.092 [2024-12-17 01:23:55.040732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94736b6b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:55.040757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:55.040821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6b6b6b6b cdw11:6b949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:55.040836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:55.040890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:55.040903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:55.040957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:55.040970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.092 [2024-12-17 01:23:55.041023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.092 [2024-12-17 01:23:55.041036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.355 #19 NEW cov: 12424 ft: 13824 corp: 15/561b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:09.355 [2024-12-17 01:23:55.100452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 
cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.355 [2024-12-17 01:23:55.100477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.355 [2024-12-17 01:23:55.100537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:d4949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.355 [2024-12-17 01:23:55.100551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.356 #20 NEW cov: 12424 ft: 14268 corp: 16/580b lim: 40 exec/s: 20 rss: 72Mb L: 19/40 MS: 1 CrossOver- 00:08:09.356 [2024-12-17 01:23:55.141038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94948494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.141063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.141117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.141131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.141188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:d4949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.141206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.141264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.141277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.141334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.141347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.356 #21 NEW cov: 12424 ft: 14312 corp: 17/620b lim: 40 exec/s: 21 rss: 72Mb L: 40/40 MS: 1 ChangeBit- 00:08:09.356 [2024-12-17 01:23:55.181166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94736b6b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.181191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.181250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6b6b6b42 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.181264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.181319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 
cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.181332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.181388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94940094 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.181401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.181456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.181469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.356 #22 NEW cov: 12424 ft: 14377 corp: 18/660b lim: 40 exec/s: 22 rss: 73Mb L: 40/40 MS: 1 CopyPart- 00:08:09.356 [2024-12-17 01:23:55.241325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94130194 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.241351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.241405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.241419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.241491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.241505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.241560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.241573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.241632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.241646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.356 #23 NEW cov: 12424 ft: 14384 corp: 19/700b lim: 40 exec/s: 23 rss: 73Mb L: 40/40 MS: 1 CopyPart- 00:08:09.356 [2024-12-17 01:23:55.301481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.301505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.301562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 
cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.301575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.301631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.301644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.301698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94948494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.301710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.356 [2024-12-17 01:23:55.301764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa01130a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.356 [2024-12-17 01:23:55.301776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.356 #24 NEW cov: 12424 ft: 14393 corp: 20/740b lim: 40 exec/s: 24 rss: 73Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:09.614 [2024-12-17 01:23:55.361652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94130194 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.614 [2024-12-17 01:23:55.361677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.614 [2024-12-17 01:23:55.361740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.361757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.361821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.361835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.361893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94948494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.361906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.361965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa01130a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.361982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.615 #25 NEW cov: 12424 ft: 14432 corp: 21/780b lim: 40 exec/s: 25 rss: 73Mb L: 40/40 MS: 1 PersAutoDict- DE: "\023\001"- 00:08:09.615 [2024-12-17 01:23:55.421534] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94130194 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.421559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.421631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.421645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.421700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.421713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.615 #26 NEW cov: 12424 ft: 14624 corp: 22/806b lim: 40 exec/s: 26 rss: 73Mb L: 26/40 MS: 1 EraseBytes- 00:08:09.615 [2024-12-17 01:23:55.461759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94130194 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.461784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.461850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.461864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.461920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.461933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.461987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949484 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.462000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.615 #27 NEW cov: 12424 ft: 14666 corp: 23/843b lim: 40 exec/s: 27 rss: 73Mb L: 37/40 MS: 1 EraseBytes- 00:08:09.615 [2024-12-17 01:23:55.522081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.522105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.522163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94d49494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.522176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 
01:23:55.522246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:d4949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.522260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.522317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.522330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.522389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.522402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.615 #28 NEW cov: 12424 ft: 14677 corp: 24/883b lim: 40 exec/s: 28 rss: 73Mb L: 40/40 MS: 1 CopyPart- 00:08:09.615 [2024-12-17 01:23:55.582268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94959494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.582292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.582349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.582363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.582418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.582431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.582486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.582499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.615 [2024-12-17 01:23:55.582556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949442 cdw11:9413010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.615 [2024-12-17 01:23:55.582568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.615 #29 NEW cov: 12424 ft: 14696 corp: 25/923b lim: 40 exec/s: 29 rss: 73Mb L: 40/40 MS: 1 CopyPart- 00:08:09.874 [2024-12-17 01:23:55.622413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.622439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:09.874 [2024-12-17 01:23:55.622497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1f000000 cdw11:949494b4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.622511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.622565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.622579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.622633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.622647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.622702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.622716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.874 #30 NEW cov: 12424 ft: 14706 corp: 26/963b lim: 40 exec/s: 30 rss: 73Mb L: 40/40 MS: 1 ChangeBit- 00:08:09.874 [2024-12-17 01:23:55.682188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94130194 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.682212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.682289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949442 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.682303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.682360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.682373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.874 #31 NEW cov: 12424 ft: 14721 corp: 27/994b lim: 40 exec/s: 31 rss: 73Mb L: 31/40 MS: 1 EraseBytes- 00:08:09.874 [2024-12-17 01:23:55.742662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94959494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.742687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.742761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.742775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.742834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949490 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.742848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.742903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.742917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.742974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949442 cdw11:9413010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.742987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.874 #32 NEW cov: 12424 ft: 14727 corp: 28/1034b lim: 40 exec/s: 32 rss: 73Mb L: 40/40 MS: 1 ChangeBit- 00:08:09.874 [2024-12-17 01:23:55.802831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.802856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.802913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.802927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.802998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.803012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.803069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.803086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.803143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.803157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.874 #33 NEW cov: 12424 ft: 14749 corp: 29/1074b lim: 40 exec/s: 33 rss: 73Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:09.874 [2024-12-17 01:23:55.842980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94736b6b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.874 [2024-12-17 01:23:55.843005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.874 [2024-12-17 01:23:55.843064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:6b6b6b42 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.875 [2024-12-17 01:23:55.843077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.875 [2024-12-17 01:23:55.843132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.875 [2024-12-17 01:23:55.843145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.875 [2024-12-17 01:23:55.843200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.875 [2024-12-17 01:23:55.843213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.875 [2024-12-17 01:23:55.843268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949494 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.875 [2024-12-17 01:23:55.843281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.134 #34 NEW cov: 12424 ft: 14754 corp: 30/1114b lim: 40 exec/s: 34 rss: 74Mb L: 40/40 MS: 1 CrossOver- 00:08:10.134 [2024-12-17 01:23:55.903181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:9494ff94 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:55.903207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:55.903262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1f000000 cdw11:949494b4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:55.903275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:55.903330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:55.903342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:55.903396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:55.903409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:55.903463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:55.903480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.134 #35 NEW cov: 12424 ft: 14813 
corp: 31/1154b lim: 40 exec/s: 35 rss: 74Mb L: 40/40 MS: 1 ChangeByte- 00:08:10.134 [2024-12-17 01:23:55.963007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94130194 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:55.963033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:55.963087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94b49442 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:55.963100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:55.963154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:55.963166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.134 #36 NEW cov: 12424 ft: 14831 corp: 32/1185b lim: 40 exec/s: 36 rss: 74Mb L: 31/40 MS: 1 ChangeBit- 00:08:10.134 [2024-12-17 01:23:56.023441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.023467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.023519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:42949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.023532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.023587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.023599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.023655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.023667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.023721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.023734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.134 #37 NEW cov: 12424 ft: 14836 corp: 33/1225b lim: 40 exec/s: 37 rss: 74Mb L: 40/40 MS: 1 CopyPart- 00:08:10.134 [2024-12-17 01:23:56.063610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.063636] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.063692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.063705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.063758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:42949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.063780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.063833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.063847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.063897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.063910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.134 #38 NEW cov: 12424 ft: 14876 corp: 34/1265b lim: 40 exec/s: 38 rss: 74Mb L: 40/40 MS: 1 CrossOver- 00:08:10.134 [2024-12-17 01:23:56.103762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.103787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.103863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:94949494 cdw11:94d49494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.103877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.103941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:94949494 cdw11:d4949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.103954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.104008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:94949494 cdw11:94949494 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.104021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.134 [2024-12-17 01:23:56.104073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:94949400 cdw11:fa13010a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.134 [2024-12-17 01:23:56.104086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 
p:0 m:0 dnr:0 00:08:10.134 #39 NEW cov: 12424 ft: 14878 corp: 35/1305b lim: 40 exec/s: 19 rss: 74Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:10.134 #39 DONE cov: 12424 ft: 14878 corp: 35/1305b lim: 40 exec/s: 19 rss: 74Mb 00:08:10.134 ###### Recommended dictionary. ###### 00:08:10.134 "\001\000\000s" # Uses: 0 00:08:10.134 "\023\001" # Uses: 2 00:08:10.134 "\037\000\000\000" # Uses: 0 00:08:10.134 "\001\000\000\000\000\000\000\020" # Uses: 0 00:08:10.134 ###### End of recommended dictionary. ###### 00:08:10.134 Done 39 runs in 2 second(s) 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:10.394 01:23:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:08:10.394 [2024-12-17 01:23:56.286766] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:10.394 [2024-12-17 01:23:56.286857] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid828993 ] 00:08:10.655 [2024-12-17 01:23:56.542834] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.655 [2024-12-17 01:23:56.573486] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.655 [2024-12-17 01:23:56.625814] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.655 [2024-12-17 01:23:56.642143] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:10.655 INFO: Running with entropic power schedule (0xFF, 100). 00:08:10.655 INFO: Seed: 3156342989 00:08:10.916 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:10.916 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:10.916 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:10.916 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.916 #2 INITED exec/s: 0 rss: 65Mb 00:08:10.916 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:10.916 This may also happen if the target rejected all inputs we tried so far 00:08:10.916 [2024-12-17 01:23:56.701615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.916 [2024-12-17 01:23:56.701642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.916 [2024-12-17 01:23:56.701698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.916 [2024-12-17 01:23:56.701712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.916 [2024-12-17 01:23:56.701764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.916 [2024-12-17 01:23:56.701777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.174 NEW_FUNC[1/714]: 0x464cf8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:11.174 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:11.174 #15 NEW cov: 12185 ft: 12185 corp: 2/32b lim: 40 exec/s: 0 rss: 73Mb L: 31/31 MS: 3 InsertByte-CopyPart-InsertRepeatedBytes- 00:08:11.174 [2024-12-17 01:23:57.032653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.032695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.174 [2024-12-17 01:23:57.032775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.032800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.174 [2024-12-17 01:23:57.032864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.032882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.174 [2024-12-17 01:23:57.032946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f7f7f7f cdw11:7f0a5d00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.032964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.174 #16 NEW cov: 12298 ft: 13114 corp: 3/68b lim: 40 exec/s: 0 rss: 73Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:08:11.174 [2024-12-17 01:23:57.092497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.092522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.174 [2024-12-17 01:23:57.092577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.092590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.174 [2024-12-17 01:23:57.092644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.092656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.174 #17 NEW cov: 12304 ft: 13455 corp: 4/99b lim: 40 exec/s: 0 rss: 73Mb L: 31/36 MS: 1 ChangeByte- 00:08:11.174 [2024-12-17 01:23:57.132734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.132762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.174 [2024-12-17 01:23:57.132837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.132851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.174 [2024-12-17 01:23:57.132916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.132929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.174 [2024-12-17 
01:23:57.132981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f7f7f3a cdw11:7f7f0a5d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.132997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.174 #18 NEW cov: 12389 ft: 13677 corp: 5/131b lim: 40 exec/s: 0 rss: 73Mb L: 32/36 MS: 1 CrossOver- 00:08:11.174 [2024-12-17 01:23:57.172734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.172758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.174 [2024-12-17 01:23:57.172814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.172844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.174 [2024-12-17 01:23:57.172897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.174 [2024-12-17 01:23:57.172910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.433 #19 NEW cov: 12389 ft: 13747 corp: 6/162b lim: 40 exec/s: 0 rss: 73Mb L: 31/36 MS: 1 ShuffleBytes- 00:08:11.433 [2024-12-17 01:23:57.232921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f1f007f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.433 [2024-12-17 01:23:57.232945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.433 [2024-12-17 01:23:57.233016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.433 [2024-12-17 01:23:57.233031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.433 [2024-12-17 01:23:57.233085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.433 [2024-12-17 01:23:57.233098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.433 #25 NEW cov: 12389 ft: 13862 corp: 7/193b lim: 40 exec/s: 0 rss: 73Mb L: 31/36 MS: 1 ChangeBinInt- 00:08:11.433 [2024-12-17 01:23:57.273112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f0a cdw11:7f212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.433 [2024-12-17 01:23:57.273137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.433 [2024-12-17 01:23:57.273192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:21212121 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.433 
[2024-12-17 01:23:57.273206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.433 [2024-12-17 01:23:57.273259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:21212121 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.433 [2024-12-17 01:23:57.273272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.433 [2024-12-17 01:23:57.273324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:21212121 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.434 [2024-12-17 01:23:57.273337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.434 #28 NEW cov: 12389 ft: 13897 corp: 8/230b lim: 40 exec/s: 0 rss: 73Mb L: 37/37 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:08:11.434 [2024-12-17 01:23:57.313207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f0a cdw11:7f212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.434 [2024-12-17 01:23:57.313230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.434 [2024-12-17 01:23:57.313301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:21212121 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.434 [2024-12-17 01:23:57.313314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.434 [2024-12-17 01:23:57.313369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:21212121 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.434 [2024-12-17 01:23:57.313382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.434 [2024-12-17 01:23:57.313432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:21212125 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.434 [2024-12-17 01:23:57.313445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.434 #29 NEW cov: 12389 ft: 13924 corp: 9/267b lim: 40 exec/s: 0 rss: 73Mb L: 37/37 MS: 1 ChangeBit- 00:08:11.434 [2024-12-17 01:23:57.373261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f1f007f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.434 [2024-12-17 01:23:57.373285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.434 [2024-12-17 01:23:57.373356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.434 [2024-12-17 01:23:57.373370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.434 [2024-12-17 01:23:57.373428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 
nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.434 [2024-12-17 01:23:57.373441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.434 #30 NEW cov: 12389 ft: 13938 corp: 10/298b lim: 40 exec/s: 0 rss: 73Mb L: 31/37 MS: 1 ChangeByte- 00:08:11.434 [2024-12-17 01:23:57.433461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.434 [2024-12-17 01:23:57.433485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.434 [2024-12-17 01:23:57.433540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.434 [2024-12-17 01:23:57.433554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.434 [2024-12-17 01:23:57.433607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.434 [2024-12-17 01:23:57.433620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.692 #31 NEW cov: 12389 ft: 14015 corp: 11/329b lim: 40 exec/s: 0 rss: 73Mb L: 31/37 MS: 1 ShuffleBytes- 00:08:11.692 [2024-12-17 01:23:57.473288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.692 [2024-12-17 01:23:57.473312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.692 #32 NEW cov: 12389 ft: 14364 corp: 12/339b lim: 40 exec/s: 0 rss: 73Mb L: 10/37 MS: 1 CrossOver- 00:08:11.692 [2024-12-17 01:23:57.513784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.692 [2024-12-17 01:23:57.513813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.692 [2024-12-17 01:23:57.513869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.692 [2024-12-17 01:23:57.513882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.692 [2024-12-17 01:23:57.513936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f877f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.692 [2024-12-17 01:23:57.513948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.692 [2024-12-17 01:23:57.514001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f7f7f3a cdw11:7f7f0a5d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.692 [2024-12-17 01:23:57.514014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.692 #33 NEW cov: 12389 ft: 14411 corp: 13/371b lim: 40 exec/s: 0 rss: 74Mb L: 32/37 MS: 1 ChangeBinInt- 00:08:11.692 [2024-12-17 01:23:57.573817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.692 [2024-12-17 01:23:57.573842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.692 [2024-12-17 01:23:57.573912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.693 [2024-12-17 01:23:57.573926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.693 [2024-12-17 01:23:57.573980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.693 [2024-12-17 01:23:57.573993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.693 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:11.693 #34 NEW cov: 12412 ft: 14453 corp: 14/402b lim: 40 exec/s: 0 rss: 74Mb L: 31/37 MS: 1 ShuffleBytes- 00:08:11.693 [2024-12-17 01:23:57.614050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.693 [2024-12-17 01:23:57.614075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.693 [2024-12-17 01:23:57.614130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.693 [2024-12-17 01:23:57.614144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.693 [2024-12-17 01:23:57.614200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.693 [2024-12-17 01:23:57.614212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.693 [2024-12-17 01:23:57.614265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.693 [2024-12-17 01:23:57.614280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.693 #35 NEW cov: 12412 ft: 14496 corp: 15/440b lim: 40 exec/s: 0 rss: 74Mb L: 38/38 MS: 1 CopyPart- 00:08:11.693 [2024-12-17 01:23:57.653803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:000005c9 cdw11:5d25bb7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.693 [2024-12-17 01:23:57.653827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.693 #38 NEW cov: 12412 ft: 
14513 corp: 16/449b lim: 40 exec/s: 0 rss: 74Mb L: 9/38 MS: 3 ShuffleBytes-CrossOver-CMP- DE: "\000\005\311]%\273|\252"- 00:08:11.693 [2024-12-17 01:23:57.694302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.693 [2024-12-17 01:23:57.694326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.693 [2024-12-17 01:23:57.694383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.693 [2024-12-17 01:23:57.694397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.693 [2024-12-17 01:23:57.694452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.693 [2024-12-17 01:23:57.694466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.693 [2024-12-17 01:23:57.694520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.693 [2024-12-17 01:23:57.694533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.951 #39 NEW cov: 12412 ft: 14542 corp: 17/488b lim: 40 exec/s: 39 rss: 74Mb L: 39/39 MS: 1 CopyPart- 00:08:11.951 [2024-12-17 01:23:57.754450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f0a cdw11:7f212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.754474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.951 [2024-12-17 01:23:57.754529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:21212121 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.754542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.951 [2024-12-17 01:23:57.754594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:21212121 cdw11:213f2121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.754607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.951 [2024-12-17 01:23:57.754662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:21212121 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.754675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.951 #40 NEW cov: 12412 ft: 14552 corp: 18/526b lim: 40 exec/s: 40 rss: 74Mb L: 38/39 MS: 1 InsertByte- 00:08:11.951 [2024-12-17 01:23:57.794469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f1f007f cdw11:7f7f7f7f SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.794497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.951 [2024-12-17 01:23:57.794570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.794584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.951 [2024-12-17 01:23:57.794638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.794651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.951 #41 NEW cov: 12412 ft: 14640 corp: 19/557b lim: 40 exec/s: 41 rss: 74Mb L: 31/39 MS: 1 ChangeBinInt- 00:08:11.951 [2024-12-17 01:23:57.834530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f1f007f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.834554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.951 [2024-12-17 01:23:57.834612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f86 cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.834625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.951 [2024-12-17 01:23:57.834694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.834708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.951 #42 NEW cov: 12412 ft: 14653 corp: 20/588b lim: 40 exec/s: 42 rss: 74Mb L: 31/39 MS: 1 ChangeBinInt- 00:08:11.951 [2024-12-17 01:23:57.894849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f4a cdw11:7f212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.894872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.951 [2024-12-17 01:23:57.894930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:21212121 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.894943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.951 [2024-12-17 01:23:57.894996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:21212121 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.895010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.951 [2024-12-17 01:23:57.895062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) 
qid:0 cid:7 nsid:0 cdw10:21212121 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.895075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.951 #43 NEW cov: 12412 ft: 14666 corp: 21/625b lim: 40 exec/s: 43 rss: 74Mb L: 37/39 MS: 1 ChangeBit- 00:08:11.951 [2024-12-17 01:23:57.934538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000518 cdw11:5d25bb7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.951 [2024-12-17 01:23:57.934562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.209 #44 NEW cov: 12412 ft: 14702 corp: 22/634b lim: 40 exec/s: 44 rss: 74Mb L: 9/39 MS: 1 ChangeByte- 00:08:12.209 [2024-12-17 01:23:57.995112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.209 [2024-12-17 01:23:57.995137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.209 [2024-12-17 01:23:57.995193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.209 [2024-12-17 01:23:57.995206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.209 [2024-12-17 01:23:57.995259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.210 [2024-12-17 01:23:57.995272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.210 [2024-12-17 01:23:57.995326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f10007f cdw11:3a7f7f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.210 [2024-12-17 01:23:57.995339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.210 #45 NEW cov: 12412 ft: 14736 corp: 23/667b lim: 40 exec/s: 45 rss: 74Mb L: 33/39 MS: 1 CMP- DE: "\020\000"- 00:08:12.210 [2024-12-17 01:23:58.055046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.210 [2024-12-17 01:23:58.055070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.210 [2024-12-17 01:23:58.055127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.210 [2024-12-17 01:23:58.055140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.210 #46 NEW cov: 12412 ft: 14928 corp: 24/684b lim: 40 exec/s: 46 rss: 74Mb L: 17/39 MS: 1 CrossOver- 00:08:12.210 [2024-12-17 01:23:58.095392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:12.210 [2024-12-17 01:23:58.095416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.210 [2024-12-17 01:23:58.095471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.210 [2024-12-17 01:23:58.095484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.210 [2024-12-17 01:23:58.095554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f877f cdw11:5d7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.210 [2024-12-17 01:23:58.095567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.210 [2024-12-17 01:23:58.095622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f7f7f3a cdw11:7f7f0a5d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.210 [2024-12-17 01:23:58.095634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.210 #47 NEW cov: 12412 ft: 14951 corp: 25/716b lim: 40 exec/s: 47 rss: 74Mb L: 32/39 MS: 1 ChangeByte- 00:08:12.210 [2024-12-17 01:23:58.155430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f1f007f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.210 [2024-12-17 01:23:58.155454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.210 [2024-12-17 01:23:58.155529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f86 cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.210 [2024-12-17 01:23:58.155543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.210 [2024-12-17 01:23:58.155598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7fb67f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.210 [2024-12-17 01:23:58.155611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.210 #48 NEW cov: 12412 ft: 14963 corp: 26/747b lim: 40 exec/s: 48 rss: 74Mb L: 31/39 MS: 1 ChangeByte- 00:08:12.468 [2024-12-17 01:23:58.215611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.468 [2024-12-17 01:23:58.215636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.468 [2024-12-17 01:23:58.215693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.468 [2024-12-17 01:23:58.215707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.468 [2024-12-17 01:23:58.215760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 
cdw10:7f7f7f7f cdw11:3a7f7f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.468 [2024-12-17 01:23:58.215773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.468 #49 NEW cov: 12412 ft: 14978 corp: 27/772b lim: 40 exec/s: 49 rss: 74Mb L: 25/39 MS: 1 EraseBytes- 00:08:12.468 [2024-12-17 01:23:58.255700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.468 [2024-12-17 01:23:58.255724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.468 [2024-12-17 01:23:58.255798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f3a7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.468 [2024-12-17 01:23:58.255812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.468 [2024-12-17 01:23:58.255881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:3a7f7f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.255894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.469 #50 NEW cov: 12412 ft: 14985 corp: 28/797b lim: 40 exec/s: 50 rss: 74Mb L: 25/39 MS: 1 CopyPart- 00:08:12.469 [2024-12-17 01:23:58.316023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.316048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.469 [2024-12-17 01:23:58.316118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.316132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.469 [2024-12-17 01:23:58.316187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f2100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.316200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.469 [2024-12-17 01:23:58.316259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f10007f cdw11:3a7f7f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.316272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.469 #51 NEW cov: 12412 ft: 15024 corp: 29/830b lim: 40 exec/s: 51 rss: 75Mb L: 33/39 MS: 1 ChangeBinInt- 00:08:12.469 [2024-12-17 01:23:58.376070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f3a7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.376094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.469 [2024-12-17 01:23:58.376164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.376178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.469 [2024-12-17 01:23:58.376232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.376245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.469 #52 NEW cov: 12412 ft: 15026 corp: 30/861b lim: 40 exec/s: 52 rss: 75Mb L: 31/39 MS: 1 ChangeByte- 00:08:12.469 [2024-12-17 01:23:58.436423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f1f007f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.436448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.469 [2024-12-17 01:23:58.436504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f0005 cdw11:c95d25bb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.436518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.469 [2024-12-17 01:23:58.436570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7caa7f86 cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.436583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.469 [2024-12-17 01:23:58.436637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f7f7f7f cdw11:7f7fb67f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.469 [2024-12-17 01:23:58.436650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.728 #53 NEW cov: 12412 ft: 15056 corp: 31/900b lim: 40 exec/s: 53 rss: 75Mb L: 39/39 MS: 1 PersAutoDict- DE: "\000\005\311]%\273|\252"- 00:08:12.728 [2024-12-17 01:23:58.496379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.496403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.496473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f1000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.496486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.496541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:3a7f7f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 
01:23:58.496557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.556591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.556614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.556687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f1027 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.556700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.556755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:3a7f7f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.556769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.728 #55 NEW cov: 12412 ft: 15071 corp: 32/925b lim: 40 exec/s: 55 rss: 75Mb L: 25/39 MS: 2 PersAutoDict-ChangeByte- DE: "\020\000"- 00:08:12.728 [2024-12-17 01:23:58.596809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f4a cdw11:7f212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.596833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.596890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:21212121 cdw11:21212121 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.596903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.596957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:21212121 cdw11:21212100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.596970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.597026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:05c95d25 cdw11:bb7caa21 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.597038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.728 #56 NEW cov: 12412 ft: 15079 corp: 33/962b lim: 40 exec/s: 56 rss: 75Mb L: 37/39 MS: 1 PersAutoDict- DE: "\000\005\311]%\273|\252"- 00:08:12.728 [2024-12-17 01:23:58.656984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.657008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.657078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.657092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.657147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.657160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.657217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7f7f7f3a cdw11:7f7f0a5d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.657233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.728 #57 NEW cov: 12412 ft: 15085 corp: 34/994b lim: 40 exec/s: 57 rss: 75Mb L: 32/39 MS: 1 CrossOver- 00:08:12.728 [2024-12-17 01:23:58.696967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7f7f7f7f cdw11:10007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.696991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.697049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7f7f7f7f cdw11:7f7f7f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.697062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.728 [2024-12-17 01:23:58.697115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f7f7f7f cdw11:7f7f3a7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.728 [2024-12-17 01:23:58.697127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.728 #58 NEW cov: 12412 ft: 15095 corp: 35/1021b lim: 40 exec/s: 29 rss: 75Mb L: 27/39 MS: 1 PersAutoDict- DE: "\020\000"- 00:08:12.728 #58 DONE cov: 12412 ft: 15095 corp: 35/1021b lim: 40 exec/s: 29 rss: 75Mb 00:08:12.728 ###### Recommended dictionary. ###### 00:08:12.728 "\000\005\311]%\273|\252" # Uses: 2 00:08:12.728 "\020\000" # Uses: 2 00:08:12.728 ###### End of recommended dictionary. 
###### 00:08:12.728 Done 58 runs in 2 second(s) 00:08:12.987 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:12.987 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:12.987 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.987 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:12.988 01:23:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:08:12.988 [2024-12-17 01:23:58.879868] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:12.988 [2024-12-17 01:23:58.879940] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid829370 ] 00:08:13.246 [2024-12-17 01:23:59.127401] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.246 [2024-12-17 01:23:59.156966] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.246 [2024-12-17 01:23:59.209395] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.246 [2024-12-17 01:23:59.225721] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:13.246 INFO: Running with entropic power schedule (0xFF, 100). 00:08:13.246 INFO: Seed: 1446374407 00:08:13.504 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:13.504 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:13.504 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:13.504 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.504 #2 INITED exec/s: 0 rss: 64Mb 00:08:13.504 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:13.504 This may also happen if the target rejected all inputs we tried so far 00:08:13.762 NEW_FUNC[1/702]: 0x4668c8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:13.762 NEW_FUNC[2/702]: 0x487e18 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:13.762 #6 NEW cov: 12074 ft: 12073 corp: 2/11b lim: 35 exec/s: 0 rss: 72Mb L: 10/10 MS: 4 CopyPart-ChangeBit-CopyPart-CMP- DE: "\001\000\000\000\000\000\000\007"- 00:08:13.762 #7 NEW cov: 12187 ft: 12808 corp: 3/21b lim: 35 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 ChangeBinInt- 00:08:13.762 #13 NEW cov: 12193 ft: 13052 corp: 4/32b lim: 35 exec/s: 0 rss: 72Mb L: 11/11 MS: 1 CrossOver- 00:08:14.020 NEW_FUNC[1/2]: 0x4812a8 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:14.020 NEW_FUNC[2/2]: 0x13556e8 in nvmf_ctrlr_set_features_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1604 00:08:14.020 #14 NEW cov: 12335 ft: 13426 corp: 5/41b lim: 35 exec/s: 0 rss: 72Mb L: 9/11 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\007"- 00:08:14.020 [2024-12-17 01:23:59.833968] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES TIMESTAMP cid:4 cdw10:0000000e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.021 [2024-12-17 01:23:59.834012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.021 [2024-12-17 01:23:59.834149] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.021 [2024-12-17 01:23:59.834168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.021 [2024-12-17 01:23:59.834298] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.021 [2024-12-17 
01:23:59.834316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.021 NEW_FUNC[1/15]: 0x1921028 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:08:14.021 NEW_FUNC[2/15]: 0x1921268 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:08:14.021 #22 NEW cov: 12473 ft: 14300 corp: 6/65b lim: 35 exec/s: 0 rss: 72Mb L: 24/24 MS: 3 ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:14.021 #23 NEW cov: 12473 ft: 14356 corp: 7/76b lim: 35 exec/s: 0 rss: 72Mb L: 11/24 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\007"- 00:08:14.021 #24 NEW cov: 12473 ft: 14432 corp: 8/85b lim: 35 exec/s: 0 rss: 72Mb L: 9/24 MS: 1 ChangeBinInt- 00:08:14.278 #25 NEW cov: 12473 ft: 14497 corp: 9/95b lim: 35 exec/s: 0 rss: 72Mb L: 10/24 MS: 1 InsertByte- 00:08:14.278 #32 NEW cov: 12473 ft: 14547 corp: 10/102b lim: 35 exec/s: 0 rss: 72Mb L: 7/24 MS: 2 EraseBytes-InsertByte- 00:08:14.278 [2024-12-17 01:24:00.115220] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.278 [2024-12-17 01:24:00.115258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.278 [2024-12-17 01:24:00.115393] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.278 [2024-12-17 01:24:00.115411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.278 [2024-12-17 01:24:00.115547] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.278 [2024-12-17 01:24:00.115564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.278 #33 NEW cov: 12473 ft: 14776 corp: 11/133b lim: 35 exec/s: 0 rss: 72Mb L: 31/31 MS: 1 CrossOver- 00:08:14.278 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:14.278 #34 NEW cov: 12496 ft: 14838 corp: 12/142b lim: 35 exec/s: 0 rss: 72Mb L: 9/31 MS: 1 ChangeBinInt- 00:08:14.278 [2024-12-17 01:24:00.235565] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES TIMESTAMP cid:4 cdw10:0000000e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.278 [2024-12-17 01:24:00.235601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.278 [2024-12-17 01:24:00.235729] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.278 [2024-12-17 01:24:00.235759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.278 [2024-12-17 01:24:00.235893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.278 [2024-12-17 01:24:00.235912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.278 [2024-12-17 01:24:00.236037] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.278 [2024-12-17 01:24:00.236056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.278 #35 NEW cov: 12503 ft: 14998 corp: 13/174b lim: 35 exec/s: 35 rss: 73Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:14.536 #36 NEW cov: 12503 ft: 15011 corp: 14/185b lim: 35 exec/s: 36 rss: 73Mb L: 11/32 MS: 1 ChangeBit- 00:08:14.536 #37 NEW cov: 12503 ft: 15103 corp: 15/195b lim: 35 exec/s: 37 rss: 73Mb L: 10/32 MS: 1 InsertByte- 00:08:14.536 #38 NEW cov: 12503 ft: 15179 corp: 16/205b lim: 35 exec/s: 38 rss: 73Mb L: 10/32 MS: 1 EraseBytes- 00:08:14.536 [2024-12-17 01:24:00.495542] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:4 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.536 [2024-12-17 01:24:00.495583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.536 #41 NEW cov: 12503 ft: 15363 corp: 17/217b lim: 35 exec/s: 41 rss: 73Mb L: 12/32 MS: 3 EraseBytes-ChangeByte-CMP- DE: "\367\310\344\312^\311\005\000"- 00:08:14.794 [2024-12-17 01:24:00.566265] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES TIMESTAMP cid:4 cdw10:0000000e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.794 [2024-12-17 01:24:00.566294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.794 [2024-12-17 01:24:00.566424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.794 [2024-12-17 01:24:00.566446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.794 [2024-12-17 01:24:00.566577] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.794 [2024-12-17 01:24:00.566597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.794 #42 NEW cov: 12503 ft: 15382 corp: 18/241b lim: 35 exec/s: 42 rss: 73Mb L: 24/32 MS: 1 CMP- DE: "\000\015"- 00:08:14.794 [2024-12-17 01:24:00.616313] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES NUMBER OF QUEUES cid:5 cdw10:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.794 [2024-12-17 01:24:00.616347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: COMMAND SEQUENCE ERROR (00/0c) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.794 NEW_FUNC[1/1]: 0x4858c8 in feat_number_of_queues /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:318 00:08:14.794 #43 NEW cov: 12537 ft: 15519 corp: 19/258b lim: 35 exec/s: 43 rss: 73Mb L: 17/32 MS: 1 CopyPart- 00:08:14.794 #44 NEW cov: 12537 ft: 15527 corp: 20/269b lim: 35 exec/s: 44 rss: 73Mb L: 11/32 MS: 1 ChangeBit- 00:08:14.794 [2024-12-17 01:24:00.737449] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES TIMESTAMP cid:4 cdw10:0000000e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.794 [2024-12-17 01:24:00.737479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.794 [2024-12-17 
01:24:00.737611] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.794 [2024-12-17 01:24:00.737632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.794 [2024-12-17 01:24:00.737760] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.794 [2024-12-17 01:24:00.737782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.794 [2024-12-17 01:24:00.737916] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.794 [2024-12-17 01:24:00.737936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.794 NEW_FUNC[1/2]: 0x484778 in feat_error_recover /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:304 00:08:14.794 NEW_FUNC[2/2]: 0x1358b98 in nvmf_ctrlr_set_features_error_recovery /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1724 00:08:14.794 #45 NEW cov: 12591 ft: 15711 corp: 21/304b lim: 35 exec/s: 45 rss: 73Mb L: 35/35 MS: 1 CrossOver- 00:08:15.053 #46 NEW cov: 12591 ft: 15720 corp: 22/314b lim: 35 exec/s: 46 rss: 73Mb L: 10/35 MS: 1 CrossOver- 00:08:15.053 NEW_FUNC[1/3]: 0x483598 in feat_temperature_threshold /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:295 00:08:15.053 NEW_FUNC[2/3]: 0x13518c8 in temp_threshold_opts_valid /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1641 00:08:15.053 #47 NEW cov: 12645 ft: 15778 corp: 23/333b lim: 35 exec/s: 47 rss: 73Mb L: 19/35 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\007"- 00:08:15.053 #48 NEW cov: 12645 ft: 15825 corp: 24/340b lim: 35 exec/s: 48 rss: 73Mb L: 7/35 MS: 1 ChangeBinInt- 00:08:15.053 #49 NEW cov: 12645 ft: 15828 corp: 25/349b lim: 35 exec/s: 49 rss: 73Mb L: 9/35 MS: 1 ChangeByte- 00:08:15.053 #50 NEW cov: 12645 ft: 15860 corp: 26/361b lim: 35 exec/s: 50 rss: 73Mb L: 12/35 MS: 1 CopyPart- 00:08:15.311 #51 NEW cov: 12645 ft: 15864 corp: 27/372b lim: 35 exec/s: 51 rss: 73Mb L: 11/35 MS: 1 CrossOver- 00:08:15.311 [2024-12-17 01:24:01.107083] ctrlr.c:1660:temp_threshold_opts_valid: *ERROR*: Invalid THSEL 2 00:08:15.311 [2024-12-17 01:24:01.107496] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES TEMPERATURE THRESHOLD cid:4 cdw10:00000004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.311 [2024-12-17 01:24:01.107539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.311 #52 NEW cov: 12654 ft: 15948 corp: 28/385b lim: 35 exec/s: 52 rss: 73Mb L: 13/35 MS: 1 CopyPart- 00:08:15.311 [2024-12-17 01:24:01.157474] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.311 [2024-12-17 01:24:01.157510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.311 #57 NEW cov: 12654 ft: 15986 corp: 29/397b lim: 35 exec/s: 57 rss: 73Mb L: 12/35 MS: 5 PersAutoDict-CopyPart-InsertByte-ShuffleBytes-CMP- DE: 
"\000\015"-"\377\377\377\377\377\377\377\000"- 00:08:15.311 #58 NEW cov: 12654 ft: 16001 corp: 30/408b lim: 35 exec/s: 58 rss: 73Mb L: 11/35 MS: 1 ChangeBinInt- 00:08:15.311 [2024-12-17 01:24:01.258482] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.311 [2024-12-17 01:24:01.258514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.311 [2024-12-17 01:24:01.258647] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.311 [2024-12-17 01:24:01.258673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.311 #59 NEW cov: 12654 ft: 16119 corp: 31/432b lim: 35 exec/s: 29 rss: 73Mb L: 24/35 MS: 1 InsertRepeatedBytes- 00:08:15.311 #59 DONE cov: 12654 ft: 16119 corp: 31/432b lim: 35 exec/s: 29 rss: 73Mb 00:08:15.311 ###### Recommended dictionary. ###### 00:08:15.311 "\001\000\000\000\000\000\000\007" # Uses: 3 00:08:15.311 "\367\310\344\312^\311\005\000" # Uses: 0 00:08:15.311 "\000\015" # Uses: 1 00:08:15.311 "\377\377\377\377\377\377\377\000" # Uses: 0 00:08:15.311 ###### End of recommended dictionary. ###### 00:08:15.311 Done 59 runs in 2 second(s) 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:15.570 01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:15.570 
01:24:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:15.570 [2024-12-17 01:24:01.454625] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:15.570 [2024-12-17 01:24:01.454711] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid830009 ] 00:08:15.829 [2024-12-17 01:24:01.636447] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.829 [2024-12-17 01:24:01.658117] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.829 [2024-12-17 01:24:01.710536] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.829 [2024-12-17 01:24:01.726878] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:15.829 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.829 INFO: Seed: 3947375588 00:08:15.829 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:15.829 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:15.829 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:15.829 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.829 #2 INITED exec/s: 0 rss: 64Mb 00:08:15.829 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:15.829 This may also happen if the target rejected all inputs we tried so far 00:08:15.829 [2024-12-17 01:24:01.782743] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.829 [2024-12-17 01:24:01.782773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.829 [2024-12-17 01:24:01.782836] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.829 [2024-12-17 01:24:01.782850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.829 [2024-12-17 01:24:01.782908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.829 [2024-12-17 01:24:01.782921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.087 NEW_FUNC[1/715]: 0x467e08 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:16.087 NEW_FUNC[2/715]: 0x487e18 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:16.087 #3 NEW cov: 12181 ft: 12179 corp: 2/29b lim: 35 exec/s: 0 rss: 72Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:16.345 [2024-12-17 01:24:02.114892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.345 [2024-12-17 01:24:02.114943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.345 [2024-12-17 01:24:02.115090] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.345 [2024-12-17 01:24:02.115113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.345 [2024-12-17 01:24:02.115259] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.345 [2024-12-17 01:24:02.115283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.345 #4 NEW cov: 12294 ft: 13027 corp: 3/62b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 CopyPart- 00:08:16.345 [2024-12-17 01:24:02.184041] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.345 [2024-12-17 01:24:02.184073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.345 #7 NEW cov: 12300 ft: 13932 corp: 4/74b lim: 35 exec/s: 0 rss: 72Mb L: 12/33 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:16.345 [2024-12-17 01:24:02.244746] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.345 [2024-12-17 01:24:02.244778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:08:16.345 [2024-12-17 01:24:02.244916] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.345 [2024-12-17 01:24:02.244935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.345 #9 NEW cov: 12385 ft: 14350 corp: 5/95b lim: 35 exec/s: 0 rss: 72Mb L: 21/33 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:16.345 [2024-12-17 01:24:02.294365] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.345 [2024-12-17 01:24:02.294393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.345 #10 NEW cov: 12385 ft: 14429 corp: 6/108b lim: 35 exec/s: 0 rss: 72Mb L: 13/33 MS: 1 CrossOver- 00:08:16.603 [2024-12-17 01:24:02.365050] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000062 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.603 [2024-12-17 01:24:02.365080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.603 [2024-12-17 01:24:02.365222] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.603 [2024-12-17 01:24:02.365239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.603 [2024-12-17 01:24:02.365366] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.603 [2024-12-17 01:24:02.365386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.603 #20 NEW cov: 12385 ft: 14503 corp: 7/131b lim: 35 exec/s: 0 rss: 72Mb L: 23/33 MS: 5 ChangeBinInt-CopyPart-InsertByte-ChangeByte-CrossOver- 00:08:16.603 [2024-12-17 01:24:02.414757] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.603 [2024-12-17 01:24:02.414789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.603 #21 NEW cov: 12385 ft: 14545 corp: 8/144b lim: 35 exec/s: 0 rss: 72Mb L: 13/33 MS: 1 ShuffleBytes- 00:08:16.603 [2024-12-17 01:24:02.485120] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.603 [2024-12-17 01:24:02.485151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.603 [2024-12-17 01:24:02.485293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.603 [2024-12-17 01:24:02.485311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.603 #22 NEW cov: 12385 ft: 14672 corp: 9/158b lim: 35 exec/s: 0 rss: 72Mb L: 14/33 MS: 1 InsertByte- 00:08:16.603 [2024-12-17 01:24:02.555700] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:16.603 [2024-12-17 01:24:02.555732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.603 [2024-12-17 01:24:02.555880] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.603 [2024-12-17 01:24:02.555900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.603 #23 NEW cov: 12385 ft: 14702 corp: 10/180b lim: 35 exec/s: 0 rss: 72Mb L: 22/33 MS: 1 InsertByte- 00:08:16.861 [2024-12-17 01:24:02.625547] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-12-17 01:24:02.625576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.861 [2024-12-17 01:24:02.625708] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-12-17 01:24:02.625726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.861 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:16.861 #24 NEW cov: 12408 ft: 14755 corp: 11/195b lim: 35 exec/s: 0 rss: 73Mb L: 15/33 MS: 1 CrossOver- 00:08:16.861 [2024-12-17 01:24:02.695605] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-12-17 01:24:02.695632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.861 #25 NEW cov: 12408 ft: 14794 corp: 12/208b lim: 35 exec/s: 0 rss: 73Mb L: 13/33 MS: 1 ChangeBinInt- 00:08:16.861 [2024-12-17 01:24:02.746345] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-12-17 01:24:02.746373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.861 [2024-12-17 01:24:02.746511] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-12-17 01:24:02.746530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.861 #26 NEW cov: 12408 ft: 14846 corp: 13/234b lim: 35 exec/s: 0 rss: 73Mb L: 26/33 MS: 1 CrossOver- 00:08:16.861 [2024-12-17 01:24:02.796503] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-12-17 01:24:02.796529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.861 [2024-12-17 01:24:02.796656] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.861 [2024-12-17 01:24:02.796671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.861 #27 
NEW cov: 12408 ft: 14868 corp: 14/261b lim: 35 exec/s: 27 rss: 73Mb L: 27/33 MS: 1 InsertByte- 00:08:17.120 [2024-12-17 01:24:02.866936] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-12-17 01:24:02.866963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.120 [2024-12-17 01:24:02.867106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000728 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-12-17 01:24:02.867126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.120 [2024-12-17 01:24:02.867268] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-12-17 01:24:02.867289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.120 #28 NEW cov: 12408 ft: 14879 corp: 15/289b lim: 35 exec/s: 28 rss: 73Mb L: 28/33 MS: 1 InsertByte- 00:08:17.120 [2024-12-17 01:24:02.936385] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-12-17 01:24:02.936414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.120 #29 NEW cov: 12408 ft: 14895 corp: 16/299b lim: 35 exec/s: 29 rss: 73Mb L: 10/33 MS: 1 EraseBytes- 00:08:17.120 [2024-12-17 01:24:03.007114] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-12-17 01:24:03.007143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.120 [2024-12-17 01:24:03.007293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000002ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-12-17 01:24:03.007312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.120 [2024-12-17 01:24:03.007447] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-12-17 01:24:03.007464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.120 #30 NEW cov: 12408 ft: 14899 corp: 17/327b lim: 35 exec/s: 30 rss: 73Mb L: 28/33 MS: 1 InsertByte- 00:08:17.120 [2024-12-17 01:24:03.057322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-12-17 01:24:03.057349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.120 [2024-12-17 01:24:03.057487] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-12-17 01:24:03.057506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:17.120 #31 NEW cov: 12408 ft: 14928 corp: 18/348b lim: 35 exec/s: 31 rss: 73Mb L: 21/33 MS: 1 ChangeByte- 00:08:17.120 [2024-12-17 01:24:03.107194] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.120 [2024-12-17 01:24:03.107221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.379 #32 NEW cov: 12408 ft: 14942 corp: 19/365b lim: 35 exec/s: 32 rss: 73Mb L: 17/33 MS: 1 CrossOver- 00:08:17.379 [2024-12-17 01:24:03.177908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.379 [2024-12-17 01:24:03.177936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.379 [2024-12-17 01:24:03.178088] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.379 [2024-12-17 01:24:03.178107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.379 [2024-12-17 01:24:03.178235] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.379 [2024-12-17 01:24:03.178254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.379 #33 NEW cov: 12408 ft: 15013 corp: 20/395b lim: 35 exec/s: 33 rss: 73Mb L: 30/33 MS: 1 InsertRepeatedBytes- 00:08:17.379 [2024-12-17 01:24:03.227529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000038a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.379 [2024-12-17 01:24:03.227557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.379 [2024-12-17 01:24:03.227687] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.379 [2024-12-17 01:24:03.227704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.379 #35 NEW cov: 12408 ft: 15036 corp: 21/413b lim: 35 exec/s: 35 rss: 73Mb L: 18/33 MS: 2 ChangeBit-CrossOver- 00:08:17.379 [2024-12-17 01:24:03.277666] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.379 [2024-12-17 01:24:03.277693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.379 [2024-12-17 01:24:03.277833] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.379 [2024-12-17 01:24:03.277851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.379 #36 NEW cov: 12408 ft: 15048 corp: 22/428b lim: 35 exec/s: 36 rss: 73Mb L: 15/33 MS: 1 InsertByte- 00:08:17.379 [2024-12-17 01:24:03.328325] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.379 
[2024-12-17 01:24:03.328352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.379 [2024-12-17 01:24:03.328489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.379 [2024-12-17 01:24:03.328509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.379 [2024-12-17 01:24:03.328651] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.379 [2024-12-17 01:24:03.328671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.379 #37 NEW cov: 12408 ft: 15085 corp: 23/459b lim: 35 exec/s: 37 rss: 73Mb L: 31/33 MS: 1 InsertByte- 00:08:17.637 [2024-12-17 01:24:03.398171] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.637 [2024-12-17 01:24:03.398200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.637 #38 NEW cov: 12408 ft: 15093 corp: 24/473b lim: 35 exec/s: 38 rss: 73Mb L: 14/33 MS: 1 EraseBytes- 00:08:17.638 [2024-12-17 01:24:03.468221] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.638 [2024-12-17 01:24:03.468250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.638 [2024-12-17 01:24:03.468389] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.638 [2024-12-17 01:24:03.468408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.638 #39 NEW cov: 12408 ft: 15127 corp: 25/488b lim: 35 exec/s: 39 rss: 73Mb L: 15/33 MS: 1 ChangeByte- 00:08:17.638 [2024-12-17 01:24:03.539054] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.638 [2024-12-17 01:24:03.539083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.638 [2024-12-17 01:24:03.539225] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.638 [2024-12-17 01:24:03.539245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.638 [2024-12-17 01:24:03.539386] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.638 [2024-12-17 01:24:03.539405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.638 #40 NEW cov: 12408 ft: 15169 corp: 26/516b lim: 35 exec/s: 40 rss: 73Mb L: 28/33 MS: 1 CopyPart- 00:08:17.638 [2024-12-17 01:24:03.589107] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000001d SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:17.638 [2024-12-17 01:24:03.589136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.638 [2024-12-17 01:24:03.589272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000072a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.638 [2024-12-17 01:24:03.589292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.638 [2024-12-17 01:24:03.589432] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.638 [2024-12-17 01:24:03.589451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.638 [2024-12-17 01:24:03.589600] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.638 [2024-12-17 01:24:03.589620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.638 #41 NEW cov: 12408 ft: 15339 corp: 27/549b lim: 35 exec/s: 41 rss: 73Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:17.638 [2024-12-17 01:24:03.639197] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.638 [2024-12-17 01:24:03.639226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.638 [2024-12-17 01:24:03.639374] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.638 [2024-12-17 01:24:03.639394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.638 [2024-12-17 01:24:03.639544] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.638 [2024-12-17 01:24:03.639564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.897 #42 NEW cov: 12408 ft: 15363 corp: 28/581b lim: 35 exec/s: 42 rss: 73Mb L: 32/33 MS: 1 InsertRepeatedBytes- 00:08:17.897 [2024-12-17 01:24:03.689476] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-12-17 01:24:03.689508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.897 [2024-12-17 01:24:03.689645] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000728 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-12-17 01:24:03.689666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.897 [2024-12-17 01:24:03.689798] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-12-17 01:24:03.689818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.897 #43 NEW 
cov: 12408 ft: 15380 corp: 29/610b lim: 35 exec/s: 43 rss: 73Mb L: 29/33 MS: 1 CopyPart- 00:08:17.897 [2024-12-17 01:24:03.759684] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-12-17 01:24:03.759714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.897 [2024-12-17 01:24:03.759840] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000013f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-12-17 01:24:03.759858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.897 [2024-12-17 01:24:03.759991] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005b0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.897 [2024-12-17 01:24:03.760010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.897 #44 NEW cov: 12408 ft: 15385 corp: 30/642b lim: 35 exec/s: 22 rss: 74Mb L: 32/33 MS: 1 CMP- DE: "\001\000?8"- 00:08:17.897 #44 DONE cov: 12408 ft: 15385 corp: 30/642b lim: 35 exec/s: 22 rss: 74Mb 00:08:17.897 ###### Recommended dictionary. ###### 00:08:17.897 "\001\000?8" # Uses: 0 00:08:17.897 ###### End of recommended dictionary. ###### 00:08:17.897 Done 44 runs in 2 second(s) 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:18.156 01:24:03 
llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:18.156 01:24:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:18.156 [2024-12-17 01:24:03.955549] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:18.156 [2024-12-17 01:24:03.955615] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid830592 ] 00:08:18.157 [2024-12-17 01:24:04.131524] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.157 [2024-12-17 01:24:04.153204] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.415 [2024-12-17 01:24:04.205666] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.415 [2024-12-17 01:24:04.222010] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:18.415 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.415 INFO: Seed: 2147418676 00:08:18.415 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:18.415 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:18.415 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:18.415 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.415 #2 INITED exec/s: 0 rss: 64Mb 00:08:18.415 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:18.415 This may also happen if the target rejected all inputs we tried so far 00:08:18.415 [2024-12-17 01:24:04.267494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129806 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.415 [2024-12-17 01:24:04.267525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.415 [2024-12-17 01:24:04.267561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.415 [2024-12-17 01:24:04.267578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.415 [2024-12-17 01:24:04.267632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.415 [2024-12-17 01:24:04.267648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.415 [2024-12-17 01:24:04.267701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.415 [2024-12-17 01:24:04.267716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.673 NEW_FUNC[1/715]: 0x4692c8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:18.673 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.673 #9 NEW cov: 12271 ft: 12265 corp: 2/94b lim: 105 exec/s: 0 rss: 72Mb L: 93/93 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:18.673 [2024-12-17 01:24:04.598441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-12-17 01:24:04.598476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.673 [2024-12-17 01:24:04.598523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-12-17 01:24:04.598541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.673 [2024-12-17 01:24:04.598600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-12-17 01:24:04.598618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.673 [2024-12-17 01:24:04.598677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-12-17 01:24:04.598697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.673 #10 NEW 
cov: 12384 ft: 12710 corp: 3/187b lim: 105 exec/s: 0 rss: 72Mb L: 93/93 MS: 1 ChangeByte- 00:08:18.673 [2024-12-17 01:24:04.658557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-12-17 01:24:04.658585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.673 [2024-12-17 01:24:04.658638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-12-17 01:24:04.658654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.673 [2024-12-17 01:24:04.658708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-12-17 01:24:04.658724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.673 [2024-12-17 01:24:04.658780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.673 [2024-12-17 01:24:04.658797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.932 #11 NEW cov: 12390 ft: 13148 corp: 4/280b lim: 105 exec/s: 0 rss: 72Mb L: 93/93 MS: 1 ChangeBinInt- 00:08:18.932 [2024-12-17 01:24:04.718709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.718737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.718789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:13136523261321064881 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.718808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.718863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.718878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.718935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.718951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.932 #17 NEW cov: 12475 ft: 13493 corp: 5/373b lim: 105 exec/s: 0 rss: 72Mb L: 93/93 MS: 1 ChangeBinInt- 00:08:18.932 [2024-12-17 01:24:04.758805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.758833] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.758906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.758923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.758989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.759009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.759064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.759083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.932 #18 NEW cov: 12475 ft: 13570 corp: 6/473b lim: 105 exec/s: 0 rss: 72Mb L: 100/100 MS: 1 CrossOver- 00:08:18.932 [2024-12-17 01:24:04.818968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743790414614271 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.818996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.819045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.819062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.819118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.819134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.819191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.819205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.932 #19 NEW cov: 12475 ft: 13651 corp: 7/570b lim: 105 exec/s: 0 rss: 72Mb L: 97/100 MS: 1 InsertRepeatedBytes- 00:08:18.932 [2024-12-17 01:24:04.859076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.859103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.859176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642458714579291726 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:18.932 [2024-12-17 01:24:04.859191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.859248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.859265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.859321] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.859337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.932 #20 NEW cov: 12475 ft: 13733 corp: 8/670b lim: 105 exec/s: 0 rss: 72Mb L: 100/100 MS: 1 CrossOver- 00:08:18.932 [2024-12-17 01:24:04.919267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.919296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.919345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642458714579291726 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.932 [2024-12-17 01:24:04.919365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.932 [2024-12-17 01:24:04.919420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.933 [2024-12-17 01:24:04.919433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.933 [2024-12-17 01:24:04.919488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.933 [2024-12-17 01:24:04.919504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.191 #21 NEW cov: 12475 ft: 13807 corp: 9/770b lim: 105 exec/s: 0 rss: 73Mb L: 100/100 MS: 1 ChangeBit- 00:08:19.191 [2024-12-17 01:24:04.979423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:04.979450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:04.979503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:19970 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:04.979519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:04.979573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:04.979588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:04.979646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:04.979661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.191 #22 NEW cov: 12475 ft: 13862 corp: 10/863b lim: 105 exec/s: 0 rss: 73Mb L: 93/100 MS: 1 CMP- DE: "\001\000\000\000\000\000\003\357"- 00:08:19.191 [2024-12-17 01:24:05.019534] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.019563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:05.019635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:19970 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.019652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:05.019709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.019724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:05.019781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.019803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.191 #23 NEW cov: 12475 ft: 13928 corp: 11/956b lim: 105 exec/s: 0 rss: 73Mb L: 93/100 MS: 1 ChangeBit- 00:08:19.191 [2024-12-17 01:24:05.079688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.079715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:05.079786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:19970 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.079806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:05.079863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.079880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:05.079939] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.079955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.191 #24 NEW cov: 12475 ft: 13972 corp: 12/1055b lim: 105 exec/s: 0 rss: 73Mb L: 99/100 MS: 1 CrossOver- 00:08:19.191 [2024-12-17 01:24:05.119659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.119686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:05.119752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.119770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:05.119824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.119840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.191 #25 NEW cov: 12475 ft: 14528 corp: 13/1134b lim: 105 exec/s: 0 rss: 73Mb L: 79/100 MS: 1 InsertRepeatedBytes- 00:08:19.191 [2024-12-17 01:24:05.159897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.159925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:05.159979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:19970 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.159994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:05.160049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.160064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.191 [2024-12-17 01:24:05.160119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.191 [2024-12-17 01:24:05.160134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.449 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:19.450 #26 NEW cov: 12498 ft: 14574 corp: 14/1227b lim: 105 exec/s: 0 rss: 73Mb L: 93/100 MS: 1 ChangeBit- 00:08:19.450 [2024-12-17 01:24:05.220082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.220110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.220182] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.220199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.220255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.220271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.220327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.220342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.450 #27 NEW cov: 12498 ft: 14597 corp: 15/1321b lim: 105 exec/s: 0 rss: 73Mb L: 94/100 MS: 1 InsertByte- 00:08:19.450 [2024-12-17 01:24:05.260229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743790414614271 len:18767 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.260257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.260329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.260344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.260400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.260416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.260478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.260493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.450 #28 NEW cov: 12498 ft: 14653 corp: 16/1418b lim: 105 exec/s: 28 rss: 73Mb L: 97/100 MS: 1 ChangeBinInt- 00:08:19.450 [2024-12-17 01:24:05.320355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129806 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.320383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.320451] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 
lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.320468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.320523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.320539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.320601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.320618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.450 #29 NEW cov: 12498 ft: 14673 corp: 17/1522b lim: 105 exec/s: 29 rss: 73Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:08:19.450 [2024-12-17 01:24:05.360446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129806 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.360474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.360530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.360544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.360599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.360614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.360671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533480056226799 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.360687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.450 #30 NEW cov: 12498 ft: 14702 corp: 18/1615b lim: 105 exec/s: 30 rss: 73Mb L: 93/104 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\003\357"- 00:08:19.450 [2024-12-17 01:24:05.400616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.400644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.400699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:19970 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.400715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 
01:24:05.400788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.400809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.400865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.400881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.450 #31 NEW cov: 12498 ft: 14721 corp: 19/1708b lim: 105 exec/s: 31 rss: 73Mb L: 93/104 MS: 1 ChangeBinInt- 00:08:19.450 [2024-12-17 01:24:05.440674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.440702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.440759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.440777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.440852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:4325048844288 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.440868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.450 [2024-12-17 01:24:05.440926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.450 [2024-12-17 01:24:05.440941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.708 #32 NEW cov: 12498 ft: 14730 corp: 20/1801b lim: 105 exec/s: 32 rss: 73Mb L: 93/104 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\003\357"- 00:08:19.708 [2024-12-17 01:24:05.480856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743790414614271 len:18767 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.480884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.708 [2024-12-17 01:24:05.480940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.480954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.708 [2024-12-17 01:24:05.481010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.481026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.708 [2024-12-17 01:24:05.481082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.481097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.708 #33 NEW cov: 12498 ft: 14772 corp: 21/1898b lim: 105 exec/s: 33 rss: 73Mb L: 97/104 MS: 1 ShuffleBytes- 00:08:19.708 [2024-12-17 01:24:05.541026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.541054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.708 [2024-12-17 01:24:05.541111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.541126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.708 [2024-12-17 01:24:05.541182] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.541197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.708 [2024-12-17 01:24:05.541253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.541269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.708 #34 NEW cov: 12498 ft: 14794 corp: 22/1992b lim: 105 exec/s: 34 rss: 73Mb L: 94/104 MS: 1 ShuffleBytes- 00:08:19.708 [2024-12-17 01:24:05.601138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.601167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.708 [2024-12-17 01:24:05.601227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:13136523261321064881 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.601241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.708 [2024-12-17 01:24:05.601298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.601314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.708 [2024-12-17 01:24:05.601370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.708 [2024-12-17 01:24:05.601387] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.708 #35 NEW cov: 12498 ft: 14804 corp: 23/2086b lim: 105 exec/s: 35 rss: 73Mb L: 94/104 MS: 1 InsertByte- 00:08:19.708 [2024-12-17 01:24:05.661408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129806 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.709 [2024-12-17 01:24:05.661435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.709 [2024-12-17 01:24:05.661494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.709 [2024-12-17 01:24:05.661507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.709 [2024-12-17 01:24:05.661579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.709 [2024-12-17 01:24:05.661594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.709 [2024-12-17 01:24:05.661650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.709 [2024-12-17 01:24:05.661666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.709 [2024-12-17 01:24:05.661725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.709 [2024-12-17 01:24:05.661741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.709 #36 NEW cov: 12498 ft: 14865 corp: 24/2191b lim: 105 exec/s: 36 rss: 73Mb L: 105/105 MS: 1 CrossOver- 00:08:19.967 [2024-12-17 01:24:05.721450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129806 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.967 [2024-12-17 01:24:05.721477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.967 [2024-12-17 01:24:05.721531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.967 [2024-12-17 01:24:05.721546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.967 [2024-12-17 01:24:05.721600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.967 [2024-12-17 01:24:05.721618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.967 [2024-12-17 01:24:05.721674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.967 
[2024-12-17 01:24:05.721688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.967 #37 NEW cov: 12498 ft: 14888 corp: 25/2284b lim: 105 exec/s: 37 rss: 73Mb L: 93/105 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\003\357"- 00:08:19.967 [2024-12-17 01:24:05.761524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.967 [2024-12-17 01:24:05.761550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.967 [2024-12-17 01:24:05.761606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.967 [2024-12-17 01:24:05.761622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.967 [2024-12-17 01:24:05.761679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.967 [2024-12-17 01:24:05.761694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.967 [2024-12-17 01:24:05.761750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.967 [2024-12-17 01:24:05.761766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.967 #38 NEW cov: 12498 ft: 14901 corp: 26/2384b lim: 105 exec/s: 38 rss: 73Mb L: 100/105 MS: 1 CopyPart- 00:08:19.967 [2024-12-17 01:24:05.801809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129806 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.967 [2024-12-17 01:24:05.801836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.801910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.801926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.801983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:19969 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.801998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.802055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533480061357646 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.802070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.802125] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 
lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.802141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.968 #39 NEW cov: 12498 ft: 14935 corp: 27/2489b lim: 105 exec/s: 39 rss: 73Mb L: 105/105 MS: 1 CopyPart- 00:08:19.968 [2024-12-17 01:24:05.841869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.841896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.841967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642458714579291726 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.841984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.842039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.842053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.842109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446548696972656639 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.842123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.842182] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.842198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.968 #40 NEW cov: 12498 ft: 14959 corp: 28/2594b lim: 105 exec/s: 40 rss: 74Mb L: 105/105 MS: 1 InsertRepeatedBytes- 00:08:19.968 [2024-12-17 01:24:05.901565] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9331882294115402113 len:33154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.901592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.968 #41 NEW cov: 12498 ft: 15432 corp: 29/2623b lim: 105 exec/s: 41 rss: 74Mb L: 29/105 MS: 1 InsertRepeatedBytes- 00:08:19.968 [2024-12-17 01:24:05.942210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743790414614271 len:18767 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.942237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.942294] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:13352700443276982713 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.942310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.942380] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.942396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.942453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.942469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.968 [2024-12-17 01:24:05.942526] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.968 [2024-12-17 01:24:05.942541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.968 #42 NEW cov: 12498 ft: 15445 corp: 30/2728b lim: 105 exec/s: 42 rss: 74Mb L: 105/105 MS: 1 InsertRepeatedBytes- 00:08:20.226 [2024-12-17 01:24:05.982287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:05.982316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.226 [2024-12-17 01:24:05.982388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:05.982404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.226 [2024-12-17 01:24:05.982463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:4325048844288 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:05.982479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.226 [2024-12-17 01:24:05.982536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:05.982551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.226 #43 NEW cov: 12498 ft: 15449 corp: 31/2829b lim: 105 exec/s: 43 rss: 74Mb L: 101/105 MS: 1 InsertRepeatedBytes- 00:08:20.226 [2024-12-17 01:24:06.042416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743790414614271 len:18767 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:06.042443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.226 [2024-12-17 01:24:06.042492] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:06.042509] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.226 [2024-12-17 01:24:06.042566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:06.042580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.226 [2024-12-17 01:24:06.042640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533482094218830 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:06.042654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.226 #44 NEW cov: 12498 ft: 15464 corp: 32/2931b lim: 105 exec/s: 44 rss: 74Mb L: 102/105 MS: 1 InsertRepeatedBytes- 00:08:20.226 [2024-12-17 01:24:06.082426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:06.082454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.226 [2024-12-17 01:24:06.082508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:06.082524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.226 [2024-12-17 01:24:06.082581] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5623393182953655886 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:06.082596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.226 [2024-12-17 01:24:06.082657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.226 [2024-12-17 01:24:06.082672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.226 #45 NEW cov: 12498 ft: 15484 corp: 33/3031b lim: 105 exec/s: 45 rss: 74Mb L: 100/105 MS: 1 CopyPart- 00:08:20.226 [2024-12-17 01:24:06.142465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.227 [2024-12-17 01:24:06.142492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.227 [2024-12-17 01:24:06.142560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.227 [2024-12-17 01:24:06.142576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.227 [2024-12-17 01:24:06.142635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:20.227 [2024-12-17 01:24:06.142650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.227 #46 NEW cov: 12498 ft: 15494 corp: 34/3103b lim: 105 exec/s: 46 rss: 74Mb L: 72/105 MS: 1 EraseBytes- 00:08:20.227 [2024-12-17 01:24:06.202784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:3073 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.227 [2024-12-17 01:24:06.202816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.227 [2024-12-17 01:24:06.202887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481906852430 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.227 [2024-12-17 01:24:06.202902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.227 [2024-12-17 01:24:06.202959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:283531398462046208 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.227 [2024-12-17 01:24:06.202974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.227 [2024-12-17 01:24:06.203033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.227 [2024-12-17 01:24:06.203048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.485 #52 NEW cov: 12498 ft: 15512 corp: 35/3204b lim: 105 exec/s: 52 rss: 74Mb L: 101/105 MS: 1 CMP- DE: "\014\000\000\000\000\000\000\000"- 00:08:20.485 [2024-12-17 01:24:06.262951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5642533480229129918 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.485 [2024-12-17 01:24:06.262979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.485 [2024-12-17 01:24:06.263051] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.485 [2024-12-17 01:24:06.263067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.485 [2024-12-17 01:24:06.263124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:5642533481369980494 len:20047 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.485 [2024-12-17 01:24:06.263140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.485 [2024-12-17 01:24:06.263201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5642533481369980494 len:19969 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.485 [2024-12-17 01:24:06.263223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.485 #53 NEW cov: 12498 ft: 15516 corp: 36/3308b lim: 105 exec/s: 26 rss: 74Mb L: 104/105 MS: 1 
InsertRepeatedBytes- 00:08:20.485 #53 DONE cov: 12498 ft: 15516 corp: 36/3308b lim: 105 exec/s: 26 rss: 74Mb 00:08:20.485 ###### Recommended dictionary. ###### 00:08:20.485 "\001\000\000\000\000\000\003\357" # Uses: 3 00:08:20.485 "\014\000\000\000\000\000\000\000" # Uses: 0 00:08:20.485 ###### End of recommended dictionary. ###### 00:08:20.485 Done 53 runs in 2 second(s) 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:20.485 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:20.486 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.486 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:20.486 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:20.486 01:24:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:20.486 [2024-12-17 01:24:06.433760] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:20.486 [2024-12-17 01:24:06.433836] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid831281 ] 00:08:20.744 [2024-12-17 01:24:06.615250] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.744 [2024-12-17 01:24:06.636464] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.744 [2024-12-17 01:24:06.688785] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.744 [2024-12-17 01:24:06.705127] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:20.744 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.744 INFO: Seed: 333435286 00:08:20.744 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:20.744 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:20.744 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:20.744 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.744 #2 INITED exec/s: 0 rss: 64Mb 00:08:20.744 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:20.744 This may also happen if the target rejected all inputs we tried so far 00:08:21.002 [2024-12-17 01:24:06.775016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.002 [2024-12-17 01:24:06.775059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.002 [2024-12-17 01:24:06.775199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.002 [2024-12-17 01:24:06.775226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.002 [2024-12-17 01:24:06.775357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.002 [2024-12-17 01:24:06.775386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.002 [2024-12-17 01:24:06.775521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.002 [2024-12-17 01:24:06.775549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.260 NEW_FUNC[1/716]: 0x46c648 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:21.260 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.260 #16 NEW cov: 12292 ft: 12281 corp: 2/97b lim: 120 exec/s: 0 rss: 72Mb L: 96/96 MS: 4 InsertByte-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:21.260 [2024-12-17 01:24:07.125731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 
nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.260 [2024-12-17 01:24:07.125785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.260 [2024-12-17 01:24:07.125935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.260 [2024-12-17 01:24:07.125967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.260 [2024-12-17 01:24:07.126086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.260 [2024-12-17 01:24:07.126117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.260 [2024-12-17 01:24:07.126247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.260 [2024-12-17 01:24:07.126277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.260 #17 NEW cov: 12405 ft: 12964 corp: 3/193b lim: 120 exec/s: 0 rss: 72Mb L: 96/96 MS: 1 ChangeByte- 00:08:21.260 [2024-12-17 01:24:07.195768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.260 [2024-12-17 01:24:07.195812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.260 [2024-12-17 01:24:07.195943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.260 [2024-12-17 01:24:07.195975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.260 [2024-12-17 01:24:07.196098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.261 [2024-12-17 01:24:07.196123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.261 [2024-12-17 01:24:07.196243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.261 [2024-12-17 01:24:07.196262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.261 #18 NEW cov: 12411 ft: 13266 corp: 4/289b lim: 120 exec/s: 0 rss: 72Mb L: 96/96 MS: 1 ShuffleBytes- 00:08:21.519 [2024-12-17 01:24:07.265887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-12-17 01:24:07.265922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.519 [2024-12-17 01:24:07.266025] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-12-17 01:24:07.266048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.519 [2024-12-17 01:24:07.266165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-12-17 01:24:07.266186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.519 #24 NEW cov: 12496 ft: 13825 corp: 5/384b lim: 120 exec/s: 0 rss: 72Mb L: 95/96 MS: 1 EraseBytes- 00:08:21.519 [2024-12-17 01:24:07.336258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-12-17 01:24:07.336291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.519 [2024-12-17 01:24:07.336402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-12-17 01:24:07.336426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.519 [2024-12-17 01:24:07.336544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-12-17 01:24:07.336565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.519 [2024-12-17 01:24:07.336678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.519 [2024-12-17 01:24:07.336701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.519 #25 NEW cov: 12496 ft: 13929 corp: 6/480b lim: 120 exec/s: 0 rss: 72Mb L: 96/96 MS: 1 InsertByte- 00:08:21.520 [2024-12-17 01:24:07.406532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.406563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.520 [2024-12-17 01:24:07.406668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.406693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.520 [2024-12-17 01:24:07.406815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.406837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.520 
[2024-12-17 01:24:07.406946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.406966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.520 #26 NEW cov: 12496 ft: 13984 corp: 7/577b lim: 120 exec/s: 0 rss: 72Mb L: 97/97 MS: 1 InsertByte- 00:08:21.520 [2024-12-17 01:24:07.456547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.456576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.520 [2024-12-17 01:24:07.456651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.456670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.520 [2024-12-17 01:24:07.456789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.456816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.520 [2024-12-17 01:24:07.456931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.456955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.520 #27 NEW cov: 12496 ft: 14039 corp: 8/673b lim: 120 exec/s: 0 rss: 72Mb L: 96/97 MS: 1 ShuffleBytes- 00:08:21.520 [2024-12-17 01:24:07.506800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.506844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.520 [2024-12-17 01:24:07.506928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.506956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.520 [2024-12-17 01:24:07.507076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709518847 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.507103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.520 [2024-12-17 01:24:07.507227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.520 [2024-12-17 01:24:07.507255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:08:21.778 #28 NEW cov: 12496 ft: 14098 corp: 9/769b lim: 120 exec/s: 0 rss: 73Mb L: 96/97 MS: 1 ChangeBit- 00:08:21.778 [2024-12-17 01:24:07.576976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.577011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.779 [2024-12-17 01:24:07.577088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.577117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.779 [2024-12-17 01:24:07.577230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.577255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.779 [2024-12-17 01:24:07.577375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073695789055 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.577401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.779 #29 NEW cov: 12496 ft: 14126 corp: 10/865b lim: 120 exec/s: 0 rss: 73Mb L: 96/97 MS: 1 CopyPart- 00:08:21.779 [2024-12-17 01:24:07.627142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.627171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.779 [2024-12-17 01:24:07.627260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.627284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.779 [2024-12-17 01:24:07.627402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.627423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.779 [2024-12-17 01:24:07.627542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.627563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.779 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:21.779 #30 NEW cov: 12519 ft: 14198 corp: 11/969b lim: 120 exec/s: 0 rss: 73Mb L: 104/104 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\020"- 00:08:21.779 [2024-12-17 01:24:07.677306] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.677336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.779 [2024-12-17 01:24:07.677397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.677431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.779 [2024-12-17 01:24:07.677559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.677586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.779 [2024-12-17 01:24:07.677702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.677727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.779 #31 NEW cov: 12519 ft: 14294 corp: 12/1065b lim: 120 exec/s: 0 rss: 73Mb L: 96/104 MS: 1 ChangeBinInt- 00:08:21.779 [2024-12-17 01:24:07.727229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.727265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.779 [2024-12-17 01:24:07.727358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.727381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.779 [2024-12-17 01:24:07.727506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.779 [2024-12-17 01:24:07.727534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.779 #32 NEW cov: 12519 ft: 14423 corp: 13/1155b lim: 120 exec/s: 32 rss: 73Mb L: 90/104 MS: 1 EraseBytes- 00:08:22.038 [2024-12-17 01:24:07.797384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:6438275381263423833 len:22874 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.797418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:07.797518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:6438275382588823897 len:22874 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.797544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:22.038 [2024-12-17 01:24:07.797668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:6438275382588823897 len:22874 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.797697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.038 #33 NEW cov: 12519 ft: 14452 corp: 14/1227b lim: 120 exec/s: 33 rss: 73Mb L: 72/104 MS: 1 InsertRepeatedBytes- 00:08:22.038 [2024-12-17 01:24:07.847860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.847891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:07.847959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.847985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:07.848108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.848130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:07.848240] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.848269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.038 #34 NEW cov: 12519 ft: 14488 corp: 15/1327b lim: 120 exec/s: 34 rss: 73Mb L: 100/104 MS: 1 CMP- DE: "\014\000\000\000"- 00:08:22.038 [2024-12-17 01:24:07.898032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.898062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:07.898137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.898163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:07.898283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.898302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:07.898421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073695789055 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.898445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.038 #40 NEW cov: 12519 ft: 14494 corp: 16/1423b lim: 120 exec/s: 40 rss: 73Mb L: 96/104 MS: 1 ShuffleBytes- 00:08:22.038 [2024-12-17 01:24:07.968229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2089670225673846783 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.968257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:07.968349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.968377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:07.968497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.968536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:07.968661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:07.968690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.038 #41 NEW cov: 12519 ft: 14555 corp: 17/1520b lim: 120 exec/s: 41 rss: 73Mb L: 97/104 MS: 1 InsertByte- 00:08:22.038 [2024-12-17 01:24:08.018421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:08.018451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:08.018519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:08.018541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:08.018665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:08.018690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.038 [2024-12-17 01:24:08.018805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073695789055 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.038 [2024-12-17 01:24:08.018828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.038 #42 NEW cov: 12519 ft: 14580 corp: 18/1616b lim: 120 exec/s: 42 rss: 73Mb L: 96/104 MS: 1 ShuffleBytes- 00:08:22.297 [2024-12-17 01:24:08.068465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 
lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.068494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.297 [2024-12-17 01:24:08.068563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.068591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.297 [2024-12-17 01:24:08.068715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709518847 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.068742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.297 [2024-12-17 01:24:08.068868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.068893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.297 #43 NEW cov: 12519 ft: 14600 corp: 19/1729b lim: 120 exec/s: 43 rss: 73Mb L: 113/113 MS: 1 InsertRepeatedBytes- 00:08:22.297 [2024-12-17 01:24:08.138730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.138762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.297 [2024-12-17 01:24:08.138836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.138861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.297 [2024-12-17 01:24:08.138976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709518847 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.138998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.297 [2024-12-17 01:24:08.139123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.139145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.297 #44 NEW cov: 12519 ft: 14613 corp: 20/1825b lim: 120 exec/s: 44 rss: 73Mb L: 96/113 MS: 1 ShuffleBytes- 00:08:22.297 [2024-12-17 01:24:08.188301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65451 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.188334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.297 [2024-12-17 01:24:08.188464] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551405 len:65314 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.188492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.297 #45 NEW cov: 12519 ft: 14975 corp: 21/1891b lim: 120 exec/s: 45 rss: 73Mb L: 66/113 MS: 1 CrossOver- 00:08:22.297 [2024-12-17 01:24:08.239124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.239161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.297 [2024-12-17 01:24:08.239281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.239310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.297 [2024-12-17 01:24:08.239433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.239462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.297 [2024-12-17 01:24:08.239588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073695789055 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.297 [2024-12-17 01:24:08.239612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.297 #46 NEW cov: 12519 ft: 14990 corp: 22/1987b lim: 120 exec/s: 46 rss: 73Mb L: 96/113 MS: 1 ChangeBinInt- 00:08:22.556 [2024-12-17 01:24:08.309346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.556 [2024-12-17 01:24:08.309379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.556 [2024-12-17 01:24:08.309441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18378064183687118847 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.556 [2024-12-17 01:24:08.309464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.556 [2024-12-17 01:24:08.309589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.556 [2024-12-17 01:24:08.309613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.556 [2024-12-17 01:24:08.309742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.556 [2024-12-17 01:24:08.309770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.556 #47 
NEW cov: 12519 ft: 15021 corp: 23/2091b lim: 120 exec/s: 47 rss: 73Mb L: 104/113 MS: 1 PersAutoDict- DE: "\014\000\000\000"- 00:08:22.556 [2024-12-17 01:24:08.379462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.556 [2024-12-17 01:24:08.379494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.556 [2024-12-17 01:24:08.379583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.379613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.557 [2024-12-17 01:24:08.379740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.379769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.557 [2024-12-17 01:24:08.379892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.379917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.557 #48 NEW cov: 12519 ft: 15056 corp: 24/2195b lim: 120 exec/s: 48 rss: 73Mb L: 104/113 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\020"- 00:08:22.557 [2024-12-17 01:24:08.429676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.429707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.557 [2024-12-17 01:24:08.429782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.429811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.557 [2024-12-17 01:24:08.429942] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.429969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.557 [2024-12-17 01:24:08.430097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.430119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.557 #49 NEW cov: 12519 ft: 15079 corp: 25/2300b lim: 120 exec/s: 49 rss: 73Mb L: 105/113 MS: 1 CopyPart- 00:08:22.557 [2024-12-17 01:24:08.479457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.479493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.557 [2024-12-17 01:24:08.479610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9223372036854775807 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.479632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.557 [2024-12-17 01:24:08.479758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.479781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.557 #50 NEW cov: 12519 ft: 15086 corp: 26/2374b lim: 120 exec/s: 50 rss: 73Mb L: 74/113 MS: 1 EraseBytes- 00:08:22.557 [2024-12-17 01:24:08.529854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.529885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.557 [2024-12-17 01:24:08.529960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.529983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.557 [2024-12-17 01:24:08.530105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.530135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.557 [2024-12-17 01:24:08.530253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.557 [2024-12-17 01:24:08.530280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.816 #51 NEW cov: 12519 ft: 15140 corp: 27/2479b lim: 120 exec/s: 51 rss: 73Mb L: 105/113 MS: 1 ChangeByte- 00:08:22.816 [2024-12-17 01:24:08.600088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.600125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.816 [2024-12-17 01:24:08.600211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069985009663 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.600239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.816 [2024-12-17 01:24:08.600361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 
lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.600386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.816 [2024-12-17 01:24:08.600513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.600541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.816 #52 NEW cov: 12519 ft: 15142 corp: 28/2575b lim: 120 exec/s: 52 rss: 73Mb L: 96/113 MS: 1 ShuffleBytes- 00:08:22.816 [2024-12-17 01:24:08.650292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.650324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.816 [2024-12-17 01:24:08.650409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.650439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.816 [2024-12-17 01:24:08.650557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.650579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.816 [2024-12-17 01:24:08.650711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.650740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.816 #53 NEW cov: 12519 ft: 15154 corp: 29/2675b lim: 120 exec/s: 53 rss: 73Mb L: 100/113 MS: 1 CopyPart- 00:08:22.816 [2024-12-17 01:24:08.700504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283466495 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.700545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.816 [2024-12-17 01:24:08.700662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709494783 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.700692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.816 [2024-12-17 01:24:08.700813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.700835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.816 [2024-12-17 01:24:08.700957] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.700977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.816 #54 NEW cov: 12519 ft: 15163 corp: 30/2775b lim: 120 exec/s: 54 rss: 73Mb L: 100/113 MS: 1 CrossOver- 00:08:22.816 [2024-12-17 01:24:08.750174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744072283488255 len:65451 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.750206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.816 [2024-12-17 01:24:08.750315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551405 len:65314 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.816 [2024-12-17 01:24:08.750343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.816 #55 NEW cov: 12519 ft: 15253 corp: 31/2841b lim: 120 exec/s: 27 rss: 74Mb L: 66/113 MS: 1 ShuffleBytes- 00:08:22.816 #55 DONE cov: 12519 ft: 15253 corp: 31/2841b lim: 120 exec/s: 27 rss: 74Mb 00:08:22.816 ###### Recommended dictionary. ###### 00:08:22.816 "\001\000\000\000\000\000\000\020" # Uses: 1 00:08:22.816 "\014\000\000\000" # Uses: 1 00:08:22.816 ###### End of recommended dictionary. ###### 00:08:22.816 Done 55 runs in 2 second(s) 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.075 
01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:23.075 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:23.076 01:24:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:08:23.076 [2024-12-17 01:24:08.941372] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:23.076 [2024-12-17 01:24:08.941441] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid831816 ] 00:08:23.334 [2024-12-17 01:24:09.114630] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.334 [2024-12-17 01:24:09.136115] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.334 [2024-12-17 01:24:09.188480] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.334 [2024-12-17 01:24:09.204770] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:23.334 INFO: Running with entropic power schedule (0xFF, 100). 00:08:23.334 INFO: Seed: 2833432756 00:08:23.334 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:23.334 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:23.334 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:23.334 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.334 #2 INITED exec/s: 0 rss: 64Mb 00:08:23.334 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:23.334 This may also happen if the target rejected all inputs we tried so far 00:08:23.334 [2024-12-17 01:24:09.253219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.334 [2024-12-17 01:24:09.253248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.593 NEW_FUNC[1/714]: 0x46ff38 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:23.593 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.593 #6 NEW cov: 12232 ft: 12228 corp: 2/25b lim: 100 exec/s: 0 rss: 72Mb L: 24/24 MS: 4 CopyPart-EraseBytes-ChangeByte-InsertRepeatedBytes- 00:08:23.593 [2024-12-17 01:24:09.584133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.593 [2024-12-17 01:24:09.584171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.851 #7 NEW cov: 12348 ft: 12889 corp: 3/49b lim: 100 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 ChangeBit- 00:08:23.851 [2024-12-17 01:24:09.644355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.851 [2024-12-17 01:24:09.644382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.851 [2024-12-17 01:24:09.644433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:23.851 [2024-12-17 01:24:09.644446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.851 #8 NEW cov: 12354 ft: 13622 corp: 4/97b lim: 100 exec/s: 0 rss: 72Mb L: 48/48 MS: 1 CopyPart- 00:08:23.851 [2024-12-17 01:24:09.684297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.851 [2024-12-17 01:24:09.684322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.851 #9 NEW cov: 12439 ft: 13884 corp: 5/121b lim: 100 exec/s: 0 rss: 72Mb L: 24/48 MS: 1 ChangeBit- 00:08:23.851 [2024-12-17 01:24:09.744497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.851 [2024-12-17 01:24:09.744524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.851 #10 NEW cov: 12439 ft: 13992 corp: 6/142b lim: 100 exec/s: 0 rss: 72Mb L: 21/48 MS: 1 EraseBytes- 00:08:23.851 [2024-12-17 01:24:09.784577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.851 [2024-12-17 01:24:09.784603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.851 #11 NEW cov: 12439 ft: 14025 corp: 7/163b lim: 100 exec/s: 0 rss: 72Mb L: 21/48 MS: 1 CopyPart- 00:08:23.851 [2024-12-17 01:24:09.844763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:23.851 [2024-12-17 01:24:09.844789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.109 #12 NEW cov: 12439 ft: 14056 corp: 8/186b lim: 100 exec/s: 0 rss: 72Mb L: 23/48 MS: 1 CopyPart- 00:08:24.109 [2024-12-17 01:24:09.904917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.109 [2024-12-17 01:24:09.904943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.109 #13 NEW cov: 12439 ft: 14098 corp: 9/209b lim: 100 exec/s: 0 rss: 72Mb L: 23/48 MS: 1 CrossOver- 00:08:24.109 [2024-12-17 01:24:09.945144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.109 [2024-12-17 01:24:09.945171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.109 [2024-12-17 01:24:09.945216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.109 [2024-12-17 01:24:09.945231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.109 #14 NEW cov: 12439 ft: 14144 corp: 10/262b lim: 100 exec/s: 0 rss: 73Mb L: 53/53 MS: 1 InsertRepeatedBytes- 00:08:24.109 [2024-12-17 01:24:10.005967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.109 [2024-12-17 01:24:10.005996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.109 [2024-12-17 01:24:10.006041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.109 [2024-12-17 01:24:10.006057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.109 #15 NEW cov: 12439 ft: 14175 corp: 11/311b lim: 100 exec/s: 0 rss: 73Mb L: 49/53 MS: 1 InsertByte- 00:08:24.109 [2024-12-17 01:24:10.065524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.109 [2024-12-17 01:24:10.065554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.109 [2024-12-17 01:24:10.065604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.109 [2024-12-17 01:24:10.065620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.109 #16 NEW cov: 12439 ft: 14238 corp: 12/360b lim: 100 exec/s: 0 rss: 73Mb L: 49/53 MS: 1 ChangeByte- 00:08:24.368 [2024-12-17 01:24:10.125591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.368 [2024-12-17 01:24:10.125621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.368 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:24.368 #22 NEW cov: 12462 ft: 14301 corp: 13/385b lim: 100 exec/s: 0 rss: 73Mb L: 25/53 MS: 1 InsertByte- 00:08:24.368 [2024-12-17 01:24:10.165914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.368 [2024-12-17 
01:24:10.165941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.368 [2024-12-17 01:24:10.165990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.368 [2024-12-17 01:24:10.166006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.368 [2024-12-17 01:24:10.166063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.368 [2024-12-17 01:24:10.166078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.368 #23 NEW cov: 12462 ft: 14605 corp: 14/452b lim: 100 exec/s: 0 rss: 73Mb L: 67/67 MS: 1 CrossOver- 00:08:24.368 [2024-12-17 01:24:10.205761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.368 [2024-12-17 01:24:10.205789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.368 #24 NEW cov: 12462 ft: 14662 corp: 15/475b lim: 100 exec/s: 0 rss: 73Mb L: 23/67 MS: 1 ChangeBinInt- 00:08:24.368 [2024-12-17 01:24:10.245920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.368 [2024-12-17 01:24:10.245947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.368 #25 NEW cov: 12462 ft: 14693 corp: 16/499b lim: 100 exec/s: 25 rss: 73Mb L: 24/67 MS: 1 ChangeByte- 00:08:24.368 [2024-12-17 01:24:10.286009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.368 [2024-12-17 01:24:10.286035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.368 #26 NEW cov: 12462 ft: 14698 corp: 17/520b lim: 100 exec/s: 26 rss: 73Mb L: 21/67 MS: 1 ShuffleBytes- 00:08:24.368 [2024-12-17 01:24:10.326133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.368 [2024-12-17 01:24:10.326159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.368 #32 NEW cov: 12462 ft: 14706 corp: 18/541b lim: 100 exec/s: 32 rss: 73Mb L: 21/67 MS: 1 ChangeBinInt- 00:08:24.368 [2024-12-17 01:24:10.366239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.368 [2024-12-17 01:24:10.366265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.626 #33 NEW cov: 12462 ft: 14723 corp: 19/562b lim: 100 exec/s: 33 rss: 73Mb L: 21/67 MS: 1 ChangeBit- 00:08:24.626 [2024-12-17 01:24:10.406452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.626 [2024-12-17 01:24:10.406479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.626 [2024-12-17 01:24:10.406523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.626 [2024-12-17 01:24:10.406539] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.626 #34 NEW cov: 12462 ft: 14777 corp: 20/606b lim: 100 exec/s: 34 rss: 73Mb L: 44/67 MS: 1 CrossOver- 00:08:24.626 [2024-12-17 01:24:10.446563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.626 [2024-12-17 01:24:10.446588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.626 [2024-12-17 01:24:10.446627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.626 [2024-12-17 01:24:10.446644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.626 #35 NEW cov: 12462 ft: 14861 corp: 21/652b lim: 100 exec/s: 35 rss: 73Mb L: 46/67 MS: 1 InsertRepeatedBytes- 00:08:24.626 [2024-12-17 01:24:10.506633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.626 [2024-12-17 01:24:10.506658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.626 #36 NEW cov: 12462 ft: 14913 corp: 22/672b lim: 100 exec/s: 36 rss: 73Mb L: 20/67 MS: 1 EraseBytes- 00:08:24.626 [2024-12-17 01:24:10.566938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.626 [2024-12-17 01:24:10.566964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.626 [2024-12-17 01:24:10.567018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.626 [2024-12-17 01:24:10.567033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.626 #37 NEW cov: 12462 ft: 14938 corp: 23/721b lim: 100 exec/s: 37 rss: 73Mb L: 49/67 MS: 1 InsertRepeatedBytes- 00:08:24.626 [2024-12-17 01:24:10.627125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.626 [2024-12-17 01:24:10.627150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.626 [2024-12-17 01:24:10.627206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.626 [2024-12-17 01:24:10.627222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.884 #38 NEW cov: 12462 ft: 14954 corp: 24/774b lim: 100 exec/s: 38 rss: 73Mb L: 53/67 MS: 1 CopyPart- 00:08:24.884 [2024-12-17 01:24:10.687286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.884 [2024-12-17 01:24:10.687312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.884 [2024-12-17 01:24:10.687366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.884 [2024-12-17 01:24:10.687380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.884 #39 NEW cov: 
12462 ft: 15045 corp: 25/823b lim: 100 exec/s: 39 rss: 73Mb L: 49/67 MS: 1 ChangeBinInt- 00:08:24.884 [2024-12-17 01:24:10.727608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.884 [2024-12-17 01:24:10.727633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.884 [2024-12-17 01:24:10.727704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.884 [2024-12-17 01:24:10.727719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.884 [2024-12-17 01:24:10.727771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.884 [2024-12-17 01:24:10.727784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.884 [2024-12-17 01:24:10.727846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:24.884 [2024-12-17 01:24:10.727862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.884 #40 NEW cov: 12462 ft: 15326 corp: 26/904b lim: 100 exec/s: 40 rss: 73Mb L: 81/81 MS: 1 InsertRepeatedBytes- 00:08:24.884 [2024-12-17 01:24:10.767640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.884 [2024-12-17 01:24:10.767666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.884 [2024-12-17 01:24:10.767701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.884 [2024-12-17 01:24:10.767716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.884 [2024-12-17 01:24:10.767773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.884 [2024-12-17 01:24:10.767787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.884 #41 NEW cov: 12462 ft: 15367 corp: 27/971b lim: 100 exec/s: 41 rss: 73Mb L: 67/81 MS: 1 ShuffleBytes- 00:08:24.884 [2024-12-17 01:24:10.827566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.884 [2024-12-17 01:24:10.827593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.884 #42 NEW cov: 12462 ft: 15412 corp: 28/996b lim: 100 exec/s: 42 rss: 73Mb L: 25/81 MS: 1 CrossOver- 00:08:24.884 [2024-12-17 01:24:10.867662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.884 [2024-12-17 01:24:10.867688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.142 #43 NEW cov: 12462 ft: 15436 corp: 29/1021b lim: 100 exec/s: 43 rss: 73Mb L: 25/81 MS: 1 ChangeBit- 00:08:25.142 [2024-12-17 01:24:10.907930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.142 [2024-12-17 
01:24:10.907956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.142 [2024-12-17 01:24:10.907994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.143 [2024-12-17 01:24:10.908007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.143 #44 NEW cov: 12462 ft: 15443 corp: 30/1067b lim: 100 exec/s: 44 rss: 74Mb L: 46/81 MS: 1 ChangeBit- 00:08:25.143 [2024-12-17 01:24:10.968132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.143 [2024-12-17 01:24:10.968157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.143 [2024-12-17 01:24:10.968195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.143 [2024-12-17 01:24:10.968209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.143 #45 NEW cov: 12462 ft: 15459 corp: 31/1113b lim: 100 exec/s: 45 rss: 74Mb L: 46/81 MS: 1 CopyPart- 00:08:25.143 [2024-12-17 01:24:11.008178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.143 [2024-12-17 01:24:11.008203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.143 [2024-12-17 01:24:11.008241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.143 [2024-12-17 01:24:11.008256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.143 #46 NEW cov: 12462 ft: 15467 corp: 32/1161b lim: 100 exec/s: 46 rss: 74Mb L: 48/81 MS: 1 ChangeBinInt- 00:08:25.143 [2024-12-17 01:24:11.048251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.143 [2024-12-17 01:24:11.048276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.143 [2024-12-17 01:24:11.048316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.143 [2024-12-17 01:24:11.048334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.143 #47 NEW cov: 12462 ft: 15478 corp: 33/1209b lim: 100 exec/s: 47 rss: 74Mb L: 48/81 MS: 1 ChangeByte- 00:08:25.143 [2024-12-17 01:24:11.108475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.143 [2024-12-17 01:24:11.108501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.143 [2024-12-17 01:24:11.108538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.143 [2024-12-17 01:24:11.108553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.143 #48 NEW cov: 12462 ft: 15505 corp: 34/1257b lim: 100 exec/s: 48 rss: 74Mb L: 48/81 MS: 1 ChangeBit- 
00:08:25.401 [2024-12-17 01:24:11.148590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.401 [2024-12-17 01:24:11.148615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.401 [2024-12-17 01:24:11.148658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.401 [2024-12-17 01:24:11.148674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.401 #49 NEW cov: 12462 ft: 15510 corp: 35/1303b lim: 100 exec/s: 49 rss: 74Mb L: 46/81 MS: 1 ChangeBinInt- 00:08:25.401 [2024-12-17 01:24:11.208628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.401 [2024-12-17 01:24:11.208656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.401 #50 NEW cov: 12462 ft: 15523 corp: 36/1324b lim: 100 exec/s: 50 rss: 74Mb L: 21/81 MS: 1 ChangeBinInt- 00:08:25.401 [2024-12-17 01:24:11.248828] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.401 [2024-12-17 01:24:11.248853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.401 [2024-12-17 01:24:11.248907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.401 [2024-12-17 01:24:11.248923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.401 #51 NEW cov: 12462 ft: 15545 corp: 37/1373b lim: 100 exec/s: 25 rss: 74Mb L: 49/81 MS: 1 InsertByte- 00:08:25.401 #51 DONE cov: 12462 ft: 15545 corp: 37/1373b lim: 100 exec/s: 25 rss: 74Mb 00:08:25.401 Done 51 runs in 2 second(s) 00:08:25.401 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:08:25.401 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:25.401 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.402 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:25.402 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:25.402 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:25.402 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.402 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:25.402 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:25.402 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:25.402 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:25.660 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19 00:08:25.660 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419 00:08:25.660 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:25.660 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:25.660 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.660 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:25.660 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:25.660 01:24:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:08:25.660 [2024-12-17 01:24:11.440281] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:25.660 [2024-12-17 01:24:11.440358] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid832151 ] 00:08:25.660 [2024-12-17 01:24:11.621003] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.660 [2024-12-17 01:24:11.642745] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.919 [2024-12-17 01:24:11.695521] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:25.919 [2024-12-17 01:24:11.711868] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:25.919 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.919 INFO: Seed: 1047489998 00:08:25.919 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:25.919 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:25.919 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:25.919 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.919 #2 INITED exec/s: 0 rss: 64Mb 00:08:25.919 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
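The per-fuzzer setup traced above (nvmf/run.sh lines 34-38) reduces to the shell sketch below. It mirrors the traced commands; the only assumption is that the sed output is redirected into the per-run config file handed to llvm_nvme_fuzz above, since the trace shows the sed invocation but not its destination.

  fuzzer_type=19
  port=44$(printf %02d "$fuzzer_type")   # -> 4419
  corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_${fuzzer_type}
  mkdir -p "$corpus_dir"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # Assumed redirection target: the /tmp/fuzz_json_19.conf used by the invocation above.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf \
      > /tmp/fuzz_json_${fuzzer_type}.conf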
00:08:25.919 This may also happen if the target rejected all inputs we tried so far 00:08:25.919 [2024-12-17 01:24:11.778109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:25.919 [2024-12-17 01:24:11.778146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.177 NEW_FUNC[1/713]: 0x472ef8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:26.177 NEW_FUNC[2/713]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.177 #10 NEW cov: 12206 ft: 12205 corp: 2/15b lim: 50 exec/s: 0 rss: 72Mb L: 14/14 MS: 3 CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:26.177 [2024-12-17 01:24:12.108621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:26.177 [2024-12-17 01:24:12.108666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.177 NEW_FUNC[1/1]: 0x1a198c8 in nvme_tcp_ctrlr_connect_qpair_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_tcp.c:2410 00:08:26.177 #11 NEW cov: 12326 ft: 12832 corp: 3/30b lim: 50 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 InsertByte- 00:08:26.177 [2024-12-17 01:24:12.168649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772205 len:1 00:08:26.177 [2024-12-17 01:24:12.168684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.436 #12 NEW cov: 12332 ft: 13116 corp: 4/46b lim: 50 exec/s: 0 rss: 72Mb L: 16/16 MS: 1 InsertByte- 00:08:26.436 [2024-12-17 01:24:12.228897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:26.436 [2024-12-17 01:24:12.228926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.436 #13 NEW cov: 12417 ft: 13531 corp: 5/61b lim: 50 exec/s: 0 rss: 72Mb L: 15/16 MS: 1 ChangeBit- 00:08:26.436 [2024-12-17 01:24:12.268953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 00:08:26.436 [2024-12-17 01:24:12.268985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.436 #14 NEW cov: 12417 ft: 13652 corp: 6/75b lim: 50 exec/s: 0 rss: 72Mb L: 14/16 MS: 1 ChangeBit- 00:08:26.436 [2024-12-17 01:24:12.309172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:26.436 [2024-12-17 01:24:12.309200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.436 #15 NEW cov: 12417 ft: 13773 corp: 7/90b lim: 50 exec/s: 0 rss: 72Mb L: 15/16 MS: 1 ShuffleBytes- 00:08:26.436 [2024-12-17 01:24:12.349367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772205 len:1 00:08:26.436 [2024-12-17 01:24:12.349400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.436 [2024-12-17 01:24:12.349519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:26.436 [2024-12-17 01:24:12.349551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.436 #16 NEW cov: 12417 ft: 14128 corp: 8/113b lim: 50 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:26.436 [2024-12-17 01:24:12.409463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:170393600 len:1 00:08:26.436 [2024-12-17 01:24:12.409490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.695 #17 NEW cov: 12417 ft: 14168 corp: 9/129b lim: 50 exec/s: 0 rss: 72Mb L: 16/23 MS: 1 InsertByte- 00:08:26.695 [2024-12-17 01:24:12.469595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:35186687344640 len:1 00:08:26.695 [2024-12-17 01:24:12.469628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.695 #18 NEW cov: 12417 ft: 14228 corp: 10/143b lim: 50 exec/s: 0 rss: 73Mb L: 14/23 MS: 1 ChangeBit- 00:08:26.695 [2024-12-17 01:24:12.539763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10977524259487744 len:1 00:08:26.695 [2024-12-17 01:24:12.539796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.695 #19 NEW cov: 12417 ft: 14327 corp: 11/159b lim: 50 exec/s: 0 rss: 73Mb L: 16/23 MS: 1 InsertByte- 00:08:26.695 [2024-12-17 01:24:12.609946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:35186687344640 len:1 00:08:26.695 [2024-12-17 01:24:12.609978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.695 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:26.695 #20 NEW cov: 12440 ft: 14351 corp: 12/173b lim: 50 exec/s: 0 rss: 73Mb L: 14/23 MS: 1 ChangeBit- 00:08:26.695 [2024-12-17 01:24:12.680206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:26.695 [2024-12-17 01:24:12.680232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.954 #21 NEW cov: 12440 ft: 14400 corp: 13/188b lim: 50 exec/s: 0 rss: 73Mb L: 15/23 MS: 1 ChangeBit- 00:08:26.954 [2024-12-17 01:24:12.730368] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:26.954 [2024-12-17 01:24:12.730400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.954 #22 NEW cov: 12440 ft: 14469 corp: 14/203b lim: 50 exec/s: 22 rss: 73Mb L: 15/23 MS: 1 InsertByte- 00:08:26.954 [2024-12-17 01:24:12.780543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:35186687344640 len:1 00:08:26.954 [2024-12-17 01:24:12.780571] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.954 #23 NEW cov: 12440 ft: 14479 corp: 15/217b lim: 50 exec/s: 23 rss: 73Mb L: 14/23 MS: 1 CopyPart- 00:08:26.954 [2024-12-17 01:24:12.850757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10977524259487744 len:1 00:08:26.954 [2024-12-17 01:24:12.850784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.954 #24 NEW cov: 12440 ft: 14494 corp: 16/233b lim: 50 exec/s: 24 rss: 73Mb L: 16/23 MS: 1 ChangeByte- 00:08:26.954 [2024-12-17 01:24:12.921117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:170393600 len:1 00:08:26.954 [2024-12-17 01:24:12.921152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.954 [2024-12-17 01:24:12.921264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:665600 len:1 00:08:26.954 [2024-12-17 01:24:12.921289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.212 #25 NEW cov: 12440 ft: 14553 corp: 17/259b lim: 50 exec/s: 25 rss: 73Mb L: 26/26 MS: 1 CopyPart- 00:08:27.212 [2024-12-17 01:24:12.991132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:27.212 [2024-12-17 01:24:12.991166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.212 #26 NEW cov: 12440 ft: 14562 corp: 18/274b lim: 50 exec/s: 26 rss: 73Mb L: 15/26 MS: 1 ChangeBinInt- 00:08:27.212 [2024-12-17 01:24:13.061306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:27.213 [2024-12-17 01:24:13.061333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.213 #27 NEW cov: 12440 ft: 14571 corp: 19/292b lim: 50 exec/s: 27 rss: 73Mb L: 18/26 MS: 1 CrossOver- 00:08:27.213 [2024-12-17 01:24:13.111883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070672875519 len:65536 00:08:27.213 [2024-12-17 01:24:13.111926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.213 [2024-12-17 01:24:13.112026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:27.213 [2024-12-17 01:24:13.112052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.213 [2024-12-17 01:24:13.112178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:27.213 [2024-12-17 01:24:13.112197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.213 #34 NEW cov: 12440 ft: 14952 corp: 20/322b lim: 50 exec/s: 34 rss: 73Mb L: 30/30 MS: 2 ChangeBit-InsertRepeatedBytes- 
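For completeness, the fuzzer 19 run whose output appears here can be reassembled from the trace above into a single copy-pasteable command. Every flag is verbatim from that trace; the absolute paths are specific to this CI workspace and would need adjusting to reproduce the run elsewhere.

  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz \
      -m 0x1 -s 512 \
      -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ \
      -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' \
      -c /tmp/fuzz_json_19.conf -t 1 \
      -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 \
      -Z 19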
00:08:27.213 [2024-12-17 01:24:13.161733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:27.213 [2024-12-17 01:24:13.161760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.213 #35 NEW cov: 12440 ft: 14956 corp: 21/340b lim: 50 exec/s: 35 rss: 73Mb L: 18/30 MS: 1 CrossOver- 00:08:27.471 [2024-12-17 01:24:13.231935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374686479856173055 len:1 00:08:27.471 [2024-12-17 01:24:13.231962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.471 #41 NEW cov: 12440 ft: 15042 corp: 22/359b lim: 50 exec/s: 41 rss: 73Mb L: 19/30 MS: 1 InsertRepeatedBytes- 00:08:27.471 [2024-12-17 01:24:13.282003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:27.471 [2024-12-17 01:24:13.282038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.471 #42 NEW cov: 12440 ft: 15070 corp: 23/373b lim: 50 exec/s: 42 rss: 73Mb L: 14/30 MS: 1 CrossOver- 00:08:27.471 [2024-12-17 01:24:13.352414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772205 len:31233 00:08:27.471 [2024-12-17 01:24:13.352448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.471 [2024-12-17 01:24:13.352562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:27.471 [2024-12-17 01:24:13.352588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.471 #43 NEW cov: 12440 ft: 15099 corp: 24/397b lim: 50 exec/s: 43 rss: 74Mb L: 24/30 MS: 1 InsertByte- 00:08:27.471 [2024-12-17 01:24:13.422611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10977524259487744 len:50373 00:08:27.471 [2024-12-17 01:24:13.422643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.471 [2024-12-17 01:24:13.422753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14178673876263027908 len:50373 00:08:27.471 [2024-12-17 01:24:13.422772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.471 #44 NEW cov: 12440 ft: 15111 corp: 25/426b lim: 50 exec/s: 44 rss: 74Mb L: 29/30 MS: 1 InsertRepeatedBytes- 00:08:27.729 [2024-12-17 01:24:13.492657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:27.729 [2024-12-17 01:24:13.492683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.729 #45 NEW cov: 12440 ft: 15122 corp: 26/436b lim: 50 exec/s: 45 rss: 74Mb L: 10/30 MS: 1 EraseBytes- 00:08:27.729 [2024-12-17 01:24:13.532732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:167772160 len:1 00:08:27.729 [2024-12-17 01:24:13.532763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.729 #46 NEW cov: 12440 ft: 15135 corp: 27/452b lim: 50 exec/s: 46 rss: 74Mb L: 16/30 MS: 1 InsertByte- 00:08:27.730 [2024-12-17 01:24:13.572995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772205 len:31233 00:08:27.730 [2024-12-17 01:24:13.573028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.730 [2024-12-17 01:24:13.573139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:40894 00:08:27.730 [2024-12-17 01:24:13.573162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.730 #47 NEW cov: 12440 ft: 15141 corp: 28/472b lim: 50 exec/s: 47 rss: 74Mb L: 20/30 MS: 1 EraseBytes- 00:08:27.730 [2024-12-17 01:24:13.633317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070672875519 len:65536 00:08:27.730 [2024-12-17 01:24:13.633349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.730 [2024-12-17 01:24:13.633448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:27.730 [2024-12-17 01:24:13.633466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.730 [2024-12-17 01:24:13.633574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18442240474082181119 len:65536 00:08:27.730 [2024-12-17 01:24:13.633595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.730 #48 NEW cov: 12440 ft: 15176 corp: 29/502b lim: 50 exec/s: 48 rss: 74Mb L: 30/30 MS: 1 ChangeBit- 00:08:27.730 [2024-12-17 01:24:13.693299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4629771062874210368 len:16449 00:08:27.730 [2024-12-17 01:24:13.693331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.730 [2024-12-17 01:24:13.693442] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4629770786759000128 len:1 00:08:27.730 [2024-12-17 01:24:13.693464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.730 #49 NEW cov: 12440 ft: 15180 corp: 30/530b lim: 50 exec/s: 49 rss: 74Mb L: 28/30 MS: 1 InsertRepeatedBytes- 00:08:27.988 [2024-12-17 01:24:13.743467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:35186687344640 len:1 00:08:27.988 [2024-12-17 01:24:13.743494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.988 #50 NEW cov: 12440 ft: 15211 corp: 31/544b lim: 50 exec/s: 25 rss: 74Mb L: 14/30 MS: 1 ShuffleBytes- 00:08:27.988 #50 DONE 
cov: 12440 ft: 15211 corp: 31/544b lim: 50 exec/s: 25 rss: 74Mb 00:08:27.988 Done 50 runs in 2 second(s) 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:27.989 01:24:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:27.989 [2024-12-17 01:24:13.940146] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
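The leak-suppression handling traced above for fuzzer 20 (nvmf/run.sh lines 28, 32, 41-42) amounts to the sketch below. The trace shows the two echo commands and the LSAN_OPTIONS assignment but not where the echoes are redirected, so writing them into the suppression file is an assumption.

  suppress_file=/var/tmp/suppress_nvmf_fuzz
  echo leak:spdk_nvmf_qpair_disconnect  > "$suppress_file"
  echo leak:nvmf_ctrlr_create          >> "$suppress_file"
  # LeakSanitizer picks the file up through LSAN_OPTIONS when llvm_nvme_fuzz runs.
  export LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0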
00:08:27.989 [2024-12-17 01:24:13.940234] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid832634 ] 00:08:28.247 [2024-12-17 01:24:14.125214] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.247 [2024-12-17 01:24:14.147965] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.247 [2024-12-17 01:24:14.200239] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:28.247 [2024-12-17 01:24:14.216527] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:28.247 INFO: Running with entropic power schedule (0xFF, 100). 00:08:28.247 INFO: Seed: 3550466449 00:08:28.247 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:28.247 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:28.247 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:28.247 INFO: A corpus is not provided, starting from an empty corpus 00:08:28.247 #2 INITED exec/s: 0 rss: 64Mb 00:08:28.247 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:28.247 This may also happen if the target rejected all inputs we tried so far 00:08:28.505 [2024-12-17 01:24:14.265611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.505 [2024-12-17 01:24:14.265640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.505 [2024-12-17 01:24:14.265692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.505 [2024-12-17 01:24:14.265708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.505 [2024-12-17 01:24:14.265767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.505 [2024-12-17 01:24:14.265783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.764 NEW_FUNC[1/716]: 0x474ab8 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:28.764 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:28.764 #5 NEW cov: 12267 ft: 12265 corp: 2/72b lim: 90 exec/s: 0 rss: 72Mb L: 71/71 MS: 3 ChangeBit-CrossOver-InsertRepeatedBytes- 00:08:28.764 [2024-12-17 01:24:14.596655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.764 [2024-12-17 01:24:14.596698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.764 [2024-12-17 01:24:14.596779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.764 [2024-12-17 01:24:14.596806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.764 
[2024-12-17 01:24:14.596876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.764 [2024-12-17 01:24:14.596897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.764 [2024-12-17 01:24:14.596964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:28.764 [2024-12-17 01:24:14.596984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.764 #10 NEW cov: 12384 ft: 13209 corp: 3/148b lim: 90 exec/s: 0 rss: 72Mb L: 76/76 MS: 5 ChangeByte-ChangeByte-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:28.764 [2024-12-17 01:24:14.636496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.764 [2024-12-17 01:24:14.636526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.764 [2024-12-17 01:24:14.636578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.764 [2024-12-17 01:24:14.636595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.764 [2024-12-17 01:24:14.636654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.764 [2024-12-17 01:24:14.636671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.764 #11 NEW cov: 12390 ft: 13465 corp: 4/219b lim: 90 exec/s: 0 rss: 72Mb L: 71/76 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:08:28.764 [2024-12-17 01:24:14.696687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.764 [2024-12-17 01:24:14.696716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.764 [2024-12-17 01:24:14.696759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.764 [2024-12-17 01:24:14.696776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.764 [2024-12-17 01:24:14.696831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.764 [2024-12-17 01:24:14.696847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.764 #12 NEW cov: 12475 ft: 13842 corp: 5/290b lim: 90 exec/s: 0 rss: 72Mb L: 71/76 MS: 1 ChangeBinInt- 00:08:28.764 [2024-12-17 01:24:14.736956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:28.764 [2024-12-17 01:24:14.736983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.764 [2024-12-17 01:24:14.737036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:28.764 [2024-12-17 01:24:14.737053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.764 [2024-12-17 01:24:14.737111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:28.764 [2024-12-17 01:24:14.737125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.764 [2024-12-17 01:24:14.737184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:28.764 [2024-12-17 01:24:14.737200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.023 #13 NEW cov: 12475 ft: 14024 corp: 6/376b lim: 90 exec/s: 0 rss: 72Mb L: 86/86 MS: 1 InsertRepeatedBytes- 00:08:29.023 [2024-12-17 01:24:14.796945] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.023 [2024-12-17 01:24:14.796977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.797014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.023 [2024-12-17 01:24:14.797032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.797095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.023 [2024-12-17 01:24:14.797110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.023 #14 NEW cov: 12475 ft: 14146 corp: 7/447b lim: 90 exec/s: 0 rss: 72Mb L: 71/86 MS: 1 ChangeBinInt- 00:08:29.023 [2024-12-17 01:24:14.837057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.023 [2024-12-17 01:24:14.837084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.837124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.023 [2024-12-17 01:24:14.837138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.837196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.023 [2024-12-17 01:24:14.837212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.023 #15 NEW cov: 12475 ft: 14215 corp: 8/514b lim: 90 exec/s: 0 rss: 72Mb L: 67/86 MS: 1 EraseBytes- 00:08:29.023 [2024-12-17 01:24:14.877338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.023 [2024-12-17 01:24:14.877365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.877429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.023 [2024-12-17 01:24:14.877445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.877501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.023 [2024-12-17 01:24:14.877519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.877578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.023 [2024-12-17 01:24:14.877595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.023 #16 NEW cov: 12475 ft: 14256 corp: 9/598b lim: 90 exec/s: 0 rss: 72Mb L: 84/86 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:29.023 [2024-12-17 01:24:14.937516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.023 [2024-12-17 01:24:14.937543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.937591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.023 [2024-12-17 01:24:14.937607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.937663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.023 [2024-12-17 01:24:14.937680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.937743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.023 [2024-12-17 01:24:14.937758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.023 #17 NEW cov: 12475 ft: 14307 corp: 10/682b lim: 90 exec/s: 0 rss: 72Mb L: 84/86 MS: 1 ChangeByte- 00:08:29.023 [2024-12-17 01:24:14.997692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.023 [2024-12-17 01:24:14.997719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.997772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.023 [2024-12-17 01:24:14.997788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.997863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.023 [2024-12-17 01:24:14.997877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.023 [2024-12-17 01:24:14.997934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.023 [2024-12-17 01:24:14.997950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.023 #18 NEW 
cov: 12475 ft: 14353 corp: 11/759b lim: 90 exec/s: 0 rss: 72Mb L: 77/86 MS: 1 InsertByte- 00:08:29.288 [2024-12-17 01:24:15.037453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.288 [2024-12-17 01:24:15.037480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.288 [2024-12-17 01:24:15.037536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.288 [2024-12-17 01:24:15.037553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.288 #19 NEW cov: 12475 ft: 14706 corp: 12/805b lim: 90 exec/s: 0 rss: 73Mb L: 46/86 MS: 1 EraseBytes- 00:08:29.288 [2024-12-17 01:24:15.097942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.288 [2024-12-17 01:24:15.097969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.288 [2024-12-17 01:24:15.098018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.288 [2024-12-17 01:24:15.098035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.288 [2024-12-17 01:24:15.098092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.289 [2024-12-17 01:24:15.098108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.098166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.289 [2024-12-17 01:24:15.098181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.289 #20 NEW cov: 12475 ft: 14713 corp: 13/881b lim: 90 exec/s: 0 rss: 73Mb L: 76/86 MS: 1 ChangeBinInt- 00:08:29.289 [2024-12-17 01:24:15.138235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.289 [2024-12-17 01:24:15.138263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.138313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.289 [2024-12-17 01:24:15.138333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.138410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.289 [2024-12-17 01:24:15.138426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.138484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.289 [2024-12-17 01:24:15.138501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 
01:24:15.138561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:29.289 [2024-12-17 01:24:15.138577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.289 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:29.289 #21 NEW cov: 12498 ft: 14801 corp: 14/971b lim: 90 exec/s: 0 rss: 73Mb L: 90/90 MS: 1 CrossOver- 00:08:29.289 [2024-12-17 01:24:15.178198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.289 [2024-12-17 01:24:15.178226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.178275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.289 [2024-12-17 01:24:15.178292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.178351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.289 [2024-12-17 01:24:15.178367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.178429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.289 [2024-12-17 01:24:15.178444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.289 #22 NEW cov: 12498 ft: 14803 corp: 15/1046b lim: 90 exec/s: 0 rss: 73Mb L: 75/90 MS: 1 InsertRepeatedBytes- 00:08:29.289 [2024-12-17 01:24:15.218296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.289 [2024-12-17 01:24:15.218323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.218393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.289 [2024-12-17 01:24:15.218410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.218467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.289 [2024-12-17 01:24:15.218482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.218540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.289 [2024-12-17 01:24:15.218557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.289 #23 NEW cov: 12498 ft: 14821 corp: 16/1118b lim: 90 exec/s: 23 rss: 73Mb L: 72/90 MS: 1 InsertByte- 00:08:29.289 [2024-12-17 01:24:15.278645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.289 [2024-12-17 01:24:15.278673] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.278727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.289 [2024-12-17 01:24:15.278744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.278800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.289 [2024-12-17 01:24:15.278816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.278869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.289 [2024-12-17 01:24:15.278884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.289 [2024-12-17 01:24:15.278942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:29.289 [2024-12-17 01:24:15.278958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.546 #24 NEW cov: 12498 ft: 14857 corp: 17/1208b lim: 90 exec/s: 24 rss: 73Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:29.546 [2024-12-17 01:24:15.338644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.546 [2024-12-17 01:24:15.338671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.546 [2024-12-17 01:24:15.338741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.546 [2024-12-17 01:24:15.338758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.546 [2024-12-17 01:24:15.338816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.546 [2024-12-17 01:24:15.338832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.546 [2024-12-17 01:24:15.338886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.546 [2024-12-17 01:24:15.338902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.546 #25 NEW cov: 12498 ft: 14900 corp: 18/1280b lim: 90 exec/s: 25 rss: 73Mb L: 72/90 MS: 1 InsertByte- 00:08:29.546 [2024-12-17 01:24:15.378585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.546 [2024-12-17 01:24:15.378611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.546 [2024-12-17 01:24:15.378669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.546 [2024-12-17 01:24:15.378685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:29.546 [2024-12-17 01:24:15.378743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.546 [2024-12-17 01:24:15.378759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.546 #26 NEW cov: 12498 ft: 14952 corp: 19/1341b lim: 90 exec/s: 26 rss: 73Mb L: 61/90 MS: 1 EraseBytes- 00:08:29.546 [2024-12-17 01:24:15.418539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.546 [2024-12-17 01:24:15.418565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.546 [2024-12-17 01:24:15.418621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.546 [2024-12-17 01:24:15.418638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.546 #32 NEW cov: 12498 ft: 14993 corp: 20/1390b lim: 90 exec/s: 32 rss: 73Mb L: 49/90 MS: 1 EraseBytes- 00:08:29.546 [2024-12-17 01:24:15.458823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.546 [2024-12-17 01:24:15.458850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.546 [2024-12-17 01:24:15.458915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.546 [2024-12-17 01:24:15.458931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.546 [2024-12-17 01:24:15.458990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.546 [2024-12-17 01:24:15.459007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.546 #33 NEW cov: 12498 ft: 15020 corp: 21/1461b lim: 90 exec/s: 33 rss: 73Mb L: 71/90 MS: 1 CrossOver- 00:08:29.546 [2024-12-17 01:24:15.499100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.546 [2024-12-17 01:24:15.499127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.546 [2024-12-17 01:24:15.499176] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.546 [2024-12-17 01:24:15.499192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.546 [2024-12-17 01:24:15.499249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.546 [2024-12-17 01:24:15.499265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.546 [2024-12-17 01:24:15.499324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.546 [2024-12-17 01:24:15.499340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 
m:0 dnr:1 00:08:29.546 #34 NEW cov: 12498 ft: 15120 corp: 22/1544b lim: 90 exec/s: 34 rss: 73Mb L: 83/90 MS: 1 InsertRepeatedBytes- 00:08:29.804 [2024-12-17 01:24:15.559294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.804 [2024-12-17 01:24:15.559321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.804 [2024-12-17 01:24:15.559378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.804 [2024-12-17 01:24:15.559394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.804 [2024-12-17 01:24:15.559450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.804 [2024-12-17 01:24:15.559466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.804 [2024-12-17 01:24:15.559523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.804 [2024-12-17 01:24:15.559539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.804 #35 NEW cov: 12498 ft: 15132 corp: 23/1621b lim: 90 exec/s: 35 rss: 73Mb L: 77/90 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:29.804 [2024-12-17 01:24:15.619472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.804 [2024-12-17 01:24:15.619498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.804 [2024-12-17 01:24:15.619556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.804 [2024-12-17 01:24:15.619573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.804 [2024-12-17 01:24:15.619631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.804 [2024-12-17 01:24:15.619645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.805 [2024-12-17 01:24:15.619702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.805 [2024-12-17 01:24:15.619719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.805 #36 NEW cov: 12498 ft: 15141 corp: 24/1705b lim: 90 exec/s: 36 rss: 73Mb L: 84/90 MS: 1 ShuffleBytes- 00:08:29.805 [2024-12-17 01:24:15.659570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.805 [2024-12-17 01:24:15.659597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.805 [2024-12-17 01:24:15.659672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.805 [2024-12-17 01:24:15.659688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.805 [2024-12-17 01:24:15.659744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.805 [2024-12-17 01:24:15.659760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.805 [2024-12-17 01:24:15.659827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.805 [2024-12-17 01:24:15.659842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.805 #37 NEW cov: 12498 ft: 15144 corp: 25/1782b lim: 90 exec/s: 37 rss: 73Mb L: 77/90 MS: 1 InsertByte- 00:08:29.805 [2024-12-17 01:24:15.699666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.805 [2024-12-17 01:24:15.699692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.805 [2024-12-17 01:24:15.699751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.805 [2024-12-17 01:24:15.699768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.805 [2024-12-17 01:24:15.699844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.805 [2024-12-17 01:24:15.699861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.805 [2024-12-17 01:24:15.699919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.805 [2024-12-17 01:24:15.699935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.805 #38 NEW cov: 12498 ft: 15156 corp: 26/1859b lim: 90 exec/s: 38 rss: 73Mb L: 77/90 MS: 1 InsertByte- 00:08:29.805 [2024-12-17 01:24:15.759817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.805 [2024-12-17 01:24:15.759844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.805 [2024-12-17 01:24:15.759902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.805 [2024-12-17 01:24:15.759918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.805 [2024-12-17 01:24:15.759978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.805 [2024-12-17 01:24:15.759995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.805 [2024-12-17 01:24:15.760054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.805 [2024-12-17 01:24:15.760071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.805 #39 NEW cov: 12498 ft: 15162 corp: 27/1931b 
lim: 90 exec/s: 39 rss: 73Mb L: 72/90 MS: 1 ChangeASCIIInt- 00:08:30.063 [2024-12-17 01:24:15.819982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.063 [2024-12-17 01:24:15.820009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:15.820060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.063 [2024-12-17 01:24:15.820076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:15.820134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.063 [2024-12-17 01:24:15.820151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:15.820208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.063 [2024-12-17 01:24:15.820226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.063 #40 NEW cov: 12498 ft: 15169 corp: 28/2003b lim: 90 exec/s: 40 rss: 73Mb L: 72/90 MS: 1 ChangeBit- 00:08:30.063 [2024-12-17 01:24:15.879705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.063 [2024-12-17 01:24:15.879732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.063 #42 NEW cov: 12498 ft: 15911 corp: 29/2025b lim: 90 exec/s: 42 rss: 73Mb L: 22/90 MS: 2 PersAutoDict-InsertRepeatedBytes- DE: "\002\000\000\000\000\000\000\000"- 00:08:30.063 [2024-12-17 01:24:15.920136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.063 [2024-12-17 01:24:15.920164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:15.920218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.063 [2024-12-17 01:24:15.920235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:15.920294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.063 [2024-12-17 01:24:15.920310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:15.960446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.063 [2024-12-17 01:24:15.960473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:15.960529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.063 [2024-12-17 01:24:15.960545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:30.063 [2024-12-17 01:24:15.960606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.063 [2024-12-17 01:24:15.960624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:15.960685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.063 [2024-12-17 01:24:15.960700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.063 #44 NEW cov: 12498 ft: 15919 corp: 30/2114b lim: 90 exec/s: 44 rss: 73Mb L: 89/90 MS: 2 ShuffleBytes-CopyPart- 00:08:30.063 [2024-12-17 01:24:16.000508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.063 [2024-12-17 01:24:16.000534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:16.000587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.063 [2024-12-17 01:24:16.000603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:16.000660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.063 [2024-12-17 01:24:16.000676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:16.000736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.063 [2024-12-17 01:24:16.000753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.063 #45 NEW cov: 12498 ft: 15926 corp: 31/2199b lim: 90 exec/s: 45 rss: 73Mb L: 85/90 MS: 1 InsertRepeatedBytes- 00:08:30.063 [2024-12-17 01:24:16.040509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.063 [2024-12-17 01:24:16.040535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:16.040575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.063 [2024-12-17 01:24:16.040592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.063 [2024-12-17 01:24:16.040652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.063 [2024-12-17 01:24:16.040669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.063 #46 NEW cov: 12498 ft: 15937 corp: 32/2267b lim: 90 exec/s: 46 rss: 74Mb L: 68/90 MS: 1 InsertByte- 00:08:30.321 [2024-12-17 01:24:16.080419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.321 [2024-12-17 01:24:16.080446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:30.321 [2024-12-17 01:24:16.080485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.321 [2024-12-17 01:24:16.080501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.321 #49 NEW cov: 12498 ft: 15945 corp: 33/2306b lim: 90 exec/s: 49 rss: 74Mb L: 39/90 MS: 3 InsertByte-InsertByte-CrossOver- 00:08:30.321 [2024-12-17 01:24:16.120907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.321 [2024-12-17 01:24:16.120934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.321 [2024-12-17 01:24:16.120991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.321 [2024-12-17 01:24:16.121004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.321 [2024-12-17 01:24:16.121063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.321 [2024-12-17 01:24:16.121078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.321 [2024-12-17 01:24:16.121135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.321 [2024-12-17 01:24:16.121149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.322 #50 NEW cov: 12498 ft: 15961 corp: 34/2383b lim: 90 exec/s: 50 rss: 74Mb L: 77/90 MS: 1 InsertByte- 00:08:30.322 [2024-12-17 01:24:16.160952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.322 [2024-12-17 01:24:16.160980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.322 [2024-12-17 01:24:16.161034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.322 [2024-12-17 01:24:16.161050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.322 [2024-12-17 01:24:16.161109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.322 [2024-12-17 01:24:16.161123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.322 [2024-12-17 01:24:16.161183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.322 [2024-12-17 01:24:16.161200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.322 #51 NEW cov: 12498 ft: 15973 corp: 35/2455b lim: 90 exec/s: 51 rss: 74Mb L: 72/90 MS: 1 ChangeByte- 00:08:30.322 [2024-12-17 01:24:16.221184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.322 [2024-12-17 01:24:16.221211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.322 [2024-12-17 01:24:16.221265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.322 [2024-12-17 01:24:16.221281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.322 [2024-12-17 01:24:16.221355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.322 [2024-12-17 01:24:16.221371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.322 [2024-12-17 01:24:16.221431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.322 [2024-12-17 01:24:16.221447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.322 #52 NEW cov: 12498 ft: 15980 corp: 36/2530b lim: 90 exec/s: 26 rss: 74Mb L: 75/90 MS: 1 ChangeByte- 00:08:30.322 #52 DONE cov: 12498 ft: 15980 corp: 36/2530b lim: 90 exec/s: 26 rss: 74Mb 00:08:30.322 ###### Recommended dictionary. ###### 00:08:30.322 "\002\000\000\000\000\000\000\000" # Uses: 3 00:08:30.322 ###### End of recommended dictionary. ###### 00:08:30.322 Done 52 runs in 2 second(s) 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:30.580 01:24:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:30.580 01:24:16 
llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:30.580 [2024-12-17 01:24:16.414107] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:30.580 [2024-12-17 01:24:16.414194] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid833138 ] 00:08:30.839 [2024-12-17 01:24:16.593174] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.839 [2024-12-17 01:24:16.615949] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.839 [2024-12-17 01:24:16.668476] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:30.839 [2024-12-17 01:24:16.684812] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:30.839 INFO: Running with entropic power schedule (0xFF, 100). 00:08:30.839 INFO: Seed: 1724512585 00:08:30.839 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:30.839 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:30.839 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:30.839 INFO: A corpus is not provided, starting from an empty corpus 00:08:30.839 #2 INITED exec/s: 0 rss: 64Mb 00:08:30.839 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:30.839 This may also happen if the target rejected all inputs we tried so far 00:08:30.839 [2024-12-17 01:24:16.730551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:30.839 [2024-12-17 01:24:16.730582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.839 [2024-12-17 01:24:16.730637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:30.839 [2024-12-17 01:24:16.730654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.839 [2024-12-17 01:24:16.730711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:30.839 [2024-12-17 01:24:16.730726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.839 [2024-12-17 01:24:16.730787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:30.839 [2024-12-17 01:24:16.730807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.097 NEW_FUNC[1/716]: 0x477ce8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:31.097 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:31.097 #6 NEW cov: 12246 ft: 12244 corp: 2/45b lim: 50 exec/s: 0 rss: 72Mb L: 44/44 MS: 4 CrossOver-CMP-ChangeBit-InsertRepeatedBytes- DE: "\377\377\377\377"- 00:08:31.097 [2024-12-17 01:24:17.062487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.097 [2024-12-17 01:24:17.062552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.097 [2024-12-17 01:24:17.062689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.097 [2024-12-17 01:24:17.062726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.097 [2024-12-17 01:24:17.062877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.097 [2024-12-17 01:24:17.062913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.097 [2024-12-17 01:24:17.063059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.097 [2024-12-17 01:24:17.063096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.356 #7 NEW cov: 12359 ft: 13044 corp: 3/89b lim: 50 exec/s: 0 rss: 72Mb L: 44/44 MS: 1 ShuffleBytes- 00:08:31.356 [2024-12-17 01:24:17.132516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.356 [2024-12-17 01:24:17.132549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.132670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.356 [2024-12-17 01:24:17.132691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.132817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.356 [2024-12-17 01:24:17.132845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.132977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.356 [2024-12-17 01:24:17.133002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.356 #8 NEW cov: 12365 ft: 13211 corp: 4/133b lim: 50 exec/s: 0 rss: 72Mb L: 44/44 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\015"- 00:08:31.356 [2024-12-17 01:24:17.182674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.356 [2024-12-17 01:24:17.182709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.182802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.356 [2024-12-17 01:24:17.182826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.182948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.356 [2024-12-17 01:24:17.182971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.183097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.356 [2024-12-17 01:24:17.183118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.356 #9 NEW cov: 12450 ft: 13421 corp: 5/177b lim: 50 exec/s: 0 rss: 72Mb L: 44/44 MS: 1 CMP- DE: "\000\000\000\007"- 00:08:31.356 [2024-12-17 01:24:17.232746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.356 [2024-12-17 01:24:17.232779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.232894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.356 [2024-12-17 01:24:17.232923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.233051] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.356 [2024-12-17 01:24:17.233078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.233209] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.356 [2024-12-17 01:24:17.233237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.356 #10 NEW cov: 12450 ft: 13458 corp: 6/226b lim: 50 exec/s: 0 rss: 72Mb L: 49/49 MS: 1 CrossOver- 00:08:31.356 [2024-12-17 01:24:17.302934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.356 [2024-12-17 01:24:17.302964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.303029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.356 [2024-12-17 01:24:17.303056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.303170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.356 [2024-12-17 01:24:17.303190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.356 [2024-12-17 01:24:17.303317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.356 [2024-12-17 01:24:17.303337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.356 #11 NEW cov: 12450 ft: 13588 corp: 7/275b lim: 50 exec/s: 0 rss: 72Mb L: 49/49 MS: 1 CrossOver- 00:08:31.615 [2024-12-17 01:24:17.373208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.615 [2024-12-17 01:24:17.373237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.615 [2024-12-17 01:24:17.373312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.615 [2024-12-17 01:24:17.373353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.615 [2024-12-17 01:24:17.373479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.615 [2024-12-17 01:24:17.373511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.615 [2024-12-17 01:24:17.373640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.615 [2024-12-17 01:24:17.373672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.615 #12 NEW cov: 12450 ft: 13660 corp: 8/315b lim: 50 exec/s: 0 rss: 72Mb L: 40/49 MS: 1 InsertRepeatedBytes- 00:08:31.615 [2024-12-17 01:24:17.423357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.615 [2024-12-17 01:24:17.423390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.615 [2024-12-17 01:24:17.423500] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.616 [2024-12-17 01:24:17.423518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.616 [2024-12-17 01:24:17.423639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.616 [2024-12-17 01:24:17.423663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.616 [2024-12-17 01:24:17.423777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.616 [2024-12-17 01:24:17.423798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.616 #13 NEW cov: 12450 ft: 13720 corp: 9/359b lim: 50 exec/s: 0 rss: 72Mb L: 44/49 MS: 1 ChangeBit- 00:08:31.616 [2024-12-17 01:24:17.493639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.616 [2024-12-17 01:24:17.493671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.616 [2024-12-17 01:24:17.493741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.616 [2024-12-17 01:24:17.493766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.616 [2024-12-17 01:24:17.493896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.616 [2024-12-17 01:24:17.493921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.616 [2024-12-17 01:24:17.494036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.616 [2024-12-17 01:24:17.494060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.616 #14 NEW cov: 12450 ft: 13762 corp: 10/403b lim: 50 exec/s: 0 rss: 72Mb L: 44/49 MS: 1 PersAutoDict- DE: "\000\000\000\007"- 00:08:31.616 [2024-12-17 01:24:17.563542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.616 [2024-12-17 01:24:17.563578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.616 [2024-12-17 01:24:17.563696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.616 [2024-12-17 01:24:17.563718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.616 [2024-12-17 01:24:17.563848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.616 [2024-12-17 01:24:17.563869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.616 #15 NEW cov: 12450 ft: 14150 corp: 11/440b lim: 50 exec/s: 0 rss: 72Mb L: 37/49 MS: 1 EraseBytes- 00:08:31.616 [2024-12-17 
01:24:17.614000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.616 [2024-12-17 01:24:17.614028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.616 [2024-12-17 01:24:17.614108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.616 [2024-12-17 01:24:17.614126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.616 [2024-12-17 01:24:17.614242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.616 [2024-12-17 01:24:17.614269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.616 [2024-12-17 01:24:17.614396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.616 [2024-12-17 01:24:17.614423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.876 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:31.876 #16 NEW cov: 12473 ft: 14200 corp: 12/480b lim: 50 exec/s: 0 rss: 73Mb L: 40/49 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:31.876 [2024-12-17 01:24:17.684030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.876 [2024-12-17 01:24:17.684065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.876 [2024-12-17 01:24:17.684139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.876 [2024-12-17 01:24:17.684161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.876 [2024-12-17 01:24:17.684288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.876 [2024-12-17 01:24:17.684316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.876 #17 NEW cov: 12473 ft: 14248 corp: 13/517b lim: 50 exec/s: 17 rss: 73Mb L: 37/49 MS: 1 ChangeByte- 00:08:31.876 [2024-12-17 01:24:17.754211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.876 [2024-12-17 01:24:17.754245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.876 [2024-12-17 01:24:17.754372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.876 [2024-12-17 01:24:17.754395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.876 [2024-12-17 01:24:17.754523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.876 [2024-12-17 01:24:17.754552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:31.876 #18 NEW cov: 12473 ft: 14262 corp: 14/549b lim: 50 exec/s: 18 rss: 73Mb L: 32/49 MS: 1 EraseBytes- 00:08:31.876 [2024-12-17 01:24:17.824643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.876 [2024-12-17 01:24:17.824676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.876 [2024-12-17 01:24:17.824779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.876 [2024-12-17 01:24:17.824801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.876 [2024-12-17 01:24:17.824924] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.876 [2024-12-17 01:24:17.824949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.876 [2024-12-17 01:24:17.825063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.876 [2024-12-17 01:24:17.825090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.876 #19 NEW cov: 12473 ft: 14271 corp: 15/590b lim: 50 exec/s: 19 rss: 73Mb L: 41/49 MS: 1 InsertByte- 00:08:31.876 [2024-12-17 01:24:17.874789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:31.876 [2024-12-17 01:24:17.874824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.876 [2024-12-17 01:24:17.874909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:31.876 [2024-12-17 01:24:17.874937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.876 [2024-12-17 01:24:17.875064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:31.876 [2024-12-17 01:24:17.875084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.876 [2024-12-17 01:24:17.875210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:31.876 [2024-12-17 01:24:17.875240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.136 #20 NEW cov: 12473 ft: 14297 corp: 16/637b lim: 50 exec/s: 20 rss: 73Mb L: 47/49 MS: 1 InsertRepeatedBytes- 00:08:32.136 [2024-12-17 01:24:17.924202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.136 [2024-12-17 01:24:17.924234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.136 #21 NEW cov: 12473 ft: 15107 corp: 17/655b lim: 50 exec/s: 21 rss: 73Mb L: 18/49 MS: 1 CrossOver- 00:08:32.136 [2024-12-17 01:24:17.995163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.136 [2024-12-17 01:24:17.995193] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.136 [2024-12-17 01:24:17.995279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.136 [2024-12-17 01:24:17.995306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.136 [2024-12-17 01:24:17.995432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.136 [2024-12-17 01:24:17.995455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.136 [2024-12-17 01:24:17.995581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:32.136 [2024-12-17 01:24:17.995610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.136 #22 NEW cov: 12473 ft: 15118 corp: 18/695b lim: 50 exec/s: 22 rss: 73Mb L: 40/49 MS: 1 ChangeBinInt- 00:08:32.136 [2024-12-17 01:24:18.045295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.136 [2024-12-17 01:24:18.045326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.136 [2024-12-17 01:24:18.045394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.136 [2024-12-17 01:24:18.045416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.136 [2024-12-17 01:24:18.045537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.136 [2024-12-17 01:24:18.045566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.136 [2024-12-17 01:24:18.045696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:32.136 [2024-12-17 01:24:18.045715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.136 #23 NEW cov: 12473 ft: 15167 corp: 19/739b lim: 50 exec/s: 23 rss: 73Mb L: 44/49 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:32.136 [2024-12-17 01:24:18.115502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.136 [2024-12-17 01:24:18.115537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.136 [2024-12-17 01:24:18.115630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.136 [2024-12-17 01:24:18.115656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.136 [2024-12-17 01:24:18.115786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.136 [2024-12-17 01:24:18.115811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:32.136 [2024-12-17 01:24:18.115941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:32.136 [2024-12-17 01:24:18.115962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.136 #24 NEW cov: 12473 ft: 15195 corp: 20/788b lim: 50 exec/s: 24 rss: 73Mb L: 49/49 MS: 1 InsertRepeatedBytes- 00:08:32.395 [2024-12-17 01:24:18.165632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.395 [2024-12-17 01:24:18.165665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.395 [2024-12-17 01:24:18.165776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.395 [2024-12-17 01:24:18.165796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.395 [2024-12-17 01:24:18.165926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.395 [2024-12-17 01:24:18.165949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.395 [2024-12-17 01:24:18.166077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:32.395 [2024-12-17 01:24:18.166101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.395 #25 NEW cov: 12473 ft: 15212 corp: 21/828b lim: 50 exec/s: 25 rss: 73Mb L: 40/49 MS: 1 ChangeBinInt- 00:08:32.395 [2024-12-17 01:24:18.235812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.395 [2024-12-17 01:24:18.235842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.395 [2024-12-17 01:24:18.235916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.395 [2024-12-17 01:24:18.235944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.395 [2024-12-17 01:24:18.236070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.395 [2024-12-17 01:24:18.236095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.395 [2024-12-17 01:24:18.236220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:32.395 [2024-12-17 01:24:18.236248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.396 #26 NEW cov: 12473 ft: 15233 corp: 22/876b lim: 50 exec/s: 26 rss: 73Mb L: 48/49 MS: 1 CMP- DE: "\000\000\000\000\0029\274u"- 00:08:32.396 [2024-12-17 01:24:18.285707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.396 [2024-12-17 01:24:18.285744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.396 [2024-12-17 01:24:18.285870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.396 [2024-12-17 01:24:18.285894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.396 [2024-12-17 01:24:18.286022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.396 [2024-12-17 01:24:18.286053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.396 #27 NEW cov: 12473 ft: 15248 corp: 23/913b lim: 50 exec/s: 27 rss: 73Mb L: 37/49 MS: 1 ChangeBinInt- 00:08:32.396 [2024-12-17 01:24:18.336220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.396 [2024-12-17 01:24:18.336252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.396 [2024-12-17 01:24:18.336369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.396 [2024-12-17 01:24:18.336392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.396 [2024-12-17 01:24:18.336529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.396 [2024-12-17 01:24:18.336553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.396 #33 NEW cov: 12473 ft: 15316 corp: 24/950b lim: 50 exec/s: 33 rss: 73Mb L: 37/49 MS: 1 ShuffleBytes- 00:08:32.655 [2024-12-17 01:24:18.406154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.655 [2024-12-17 01:24:18.406188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.655 [2024-12-17 01:24:18.406308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.655 [2024-12-17 01:24:18.406337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.655 [2024-12-17 01:24:18.406459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.655 [2024-12-17 01:24:18.406486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.655 #34 NEW cov: 12473 ft: 15330 corp: 25/987b lim: 50 exec/s: 34 rss: 73Mb L: 37/49 MS: 1 ChangeBinInt- 00:08:32.655 [2024-12-17 01:24:18.456527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.655 [2024-12-17 01:24:18.456555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.655 [2024-12-17 01:24:18.456646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.655 [2024-12-17 01:24:18.456671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.655 [2024-12-17 01:24:18.456801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.655 [2024-12-17 01:24:18.456833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.655 [2024-12-17 01:24:18.456961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:32.655 [2024-12-17 01:24:18.456984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.655 #35 NEW cov: 12473 ft: 15345 corp: 26/1035b lim: 50 exec/s: 35 rss: 73Mb L: 48/49 MS: 1 ChangeBinInt- 00:08:32.655 [2024-12-17 01:24:18.526470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.655 [2024-12-17 01:24:18.526502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.655 [2024-12-17 01:24:18.526633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.655 [2024-12-17 01:24:18.526656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.655 [2024-12-17 01:24:18.526780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.655 [2024-12-17 01:24:18.526808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.655 #36 NEW cov: 12473 ft: 15350 corp: 27/1066b lim: 50 exec/s: 36 rss: 73Mb L: 31/49 MS: 1 EraseBytes- 00:08:32.656 [2024-12-17 01:24:18.576922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.656 [2024-12-17 01:24:18.576955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.656 [2024-12-17 01:24:18.577055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.656 [2024-12-17 01:24:18.577082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.656 [2024-12-17 01:24:18.577214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.656 [2024-12-17 01:24:18.577234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.656 [2024-12-17 01:24:18.577370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:32.656 [2024-12-17 01:24:18.577399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.656 #37 NEW cov: 12473 ft: 15377 corp: 28/1110b lim: 50 exec/s: 37 rss: 74Mb L: 44/49 MS: 1 ChangeBinInt- 00:08:32.656 [2024-12-17 01:24:18.647137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.656 [2024-12-17 01:24:18.647176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.656 [2024-12-17 01:24:18.647240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.656 [2024-12-17 01:24:18.647268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.656 [2024-12-17 01:24:18.647398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.656 [2024-12-17 01:24:18.647427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.656 [2024-12-17 01:24:18.647555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:32.656 [2024-12-17 01:24:18.647582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.915 #38 NEW cov: 12473 ft: 15416 corp: 29/1150b lim: 50 exec/s: 38 rss: 74Mb L: 40/49 MS: 1 CrossOver- 00:08:32.915 [2024-12-17 01:24:18.717587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.915 [2024-12-17 01:24:18.717626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.915 [2024-12-17 01:24:18.717755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.915 [2024-12-17 01:24:18.717783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.915 [2024-12-17 01:24:18.717917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.915 [2024-12-17 01:24:18.717942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.915 [2024-12-17 01:24:18.718071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:32.915 [2024-12-17 01:24:18.718095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.915 #39 NEW cov: 12473 ft: 15428 corp: 30/1199b lim: 50 exec/s: 19 rss: 74Mb L: 49/49 MS: 1 ChangeBit- 00:08:32.915 #39 DONE cov: 12473 ft: 15428 corp: 30/1199b lim: 50 exec/s: 19 rss: 74Mb 00:08:32.915 ###### Recommended dictionary. ###### 00:08:32.915 "\377\377\377\377" # Uses: 0 00:08:32.915 "\000\000\000\000\000\000\000\015" # Uses: 0 00:08:32.915 "\000\000\000\007" # Uses: 2 00:08:32.915 "\000\000\000\000\0029\274u" # Uses: 0 00:08:32.915 ###### End of recommended dictionary. 
###### 00:08:32.915 Done 39 runs in 2 second(s) 00:08:32.915 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:32.915 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:32.915 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.915 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:32.916 01:24:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:32.916 [2024-12-17 01:24:18.911592] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:32.916 [2024-12-17 01:24:18.911680] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid833450 ] 00:08:33.175 [2024-12-17 01:24:19.089263] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.175 [2024-12-17 01:24:19.110610] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.175 [2024-12-17 01:24:19.163258] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.434 [2024-12-17 01:24:19.179567] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:33.434 INFO: Running with entropic power schedule (0xFF, 100). 00:08:33.434 INFO: Seed: 4218510440 00:08:33.434 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:33.434 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:33.434 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:33.434 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.434 #2 INITED exec/s: 0 rss: 65Mb 00:08:33.434 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:33.434 This may also happen if the target rejected all inputs we tried so far 00:08:33.434 [2024-12-17 01:24:19.249032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.434 [2024-12-17 01:24:19.249073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.693 NEW_FUNC[1/715]: 0x479fb8 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:33.693 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:33.693 #9 NEW cov: 12253 ft: 12249 corp: 2/25b lim: 85 exec/s: 0 rss: 72Mb L: 24/24 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:33.693 [2024-12-17 01:24:19.589651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.693 [2024-12-17 01:24:19.589697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.693 NEW_FUNC[1/1]: 0x19862b8 in nvme_get_transport /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_transport.c:56 00:08:33.693 #10 NEW cov: 12385 ft: 13027 corp: 3/49b lim: 85 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 ShuffleBytes- 00:08:33.693 [2024-12-17 01:24:19.649843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.693 [2024-12-17 01:24:19.649871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.693 #13 NEW cov: 12391 ft: 13298 corp: 4/75b lim: 85 exec/s: 0 rss: 72Mb L: 26/26 MS: 3 CopyPart-ChangeByte-CrossOver- 00:08:33.693 [2024-12-17 01:24:19.690026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.693 [2024-12-17 01:24:19.690052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.952 #14 NEW cov: 12476 ft: 13517 corp: 5/101b lim: 85 exec/s: 0 rss: 73Mb L: 26/26 MS: 1 CMP- DE: "\377\007"- 00:08:33.952 [2024-12-17 01:24:19.760130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.952 [2024-12-17 01:24:19.760165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.952 #15 NEW cov: 12476 ft: 13564 corp: 6/127b lim: 85 exec/s: 0 rss: 73Mb L: 26/26 MS: 1 PersAutoDict- DE: "\377\007"- 00:08:33.952 [2024-12-17 01:24:19.800232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.952 [2024-12-17 01:24:19.800258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.952 #21 NEW cov: 12476 ft: 13599 corp: 7/151b lim: 85 exec/s: 0 rss: 73Mb L: 24/26 MS: 1 CopyPart- 00:08:33.952 [2024-12-17 01:24:19.860747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.952 [2024-12-17 01:24:19.860777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.952 [2024-12-17 01:24:19.860905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.952 [2024-12-17 01:24:19.860927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.952 #22 NEW cov: 12476 ft: 14428 corp: 8/201b lim: 85 exec/s: 0 rss: 73Mb L: 50/50 MS: 1 CrossOver- 00:08:33.952 [2024-12-17 01:24:19.920843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:33.952 [2024-12-17 01:24:19.920877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.952 [2024-12-17 01:24:19.920996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:33.952 [2024-12-17 01:24:19.921021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.952 #28 NEW cov: 12476 ft: 14448 corp: 9/245b lim: 85 exec/s: 0 rss: 73Mb L: 44/50 MS: 1 CopyPart- 00:08:34.211 [2024-12-17 01:24:19.970685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.211 [2024-12-17 01:24:19.970712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.211 #29 NEW cov: 12476 ft: 14470 corp: 10/269b lim: 85 exec/s: 0 rss: 73Mb L: 24/50 MS: 1 CrossOver- 00:08:34.211 [2024-12-17 01:24:20.040962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.211 [2024-12-17 01:24:20.040993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.211 #30 NEW cov: 12476 ft: 14543 corp: 11/288b lim: 85 exec/s: 0 rss: 73Mb L: 19/50 MS: 1 EraseBytes- 00:08:34.211 [2024-12-17 01:24:20.091140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 
nsid:0 00:08:34.211 [2024-12-17 01:24:20.091169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.211 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:34.211 #31 NEW cov: 12499 ft: 14586 corp: 12/314b lim: 85 exec/s: 0 rss: 73Mb L: 26/50 MS: 1 ChangeBinInt- 00:08:34.211 [2024-12-17 01:24:20.161622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.211 [2024-12-17 01:24:20.161656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.211 [2024-12-17 01:24:20.161786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.211 [2024-12-17 01:24:20.161808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.211 #32 NEW cov: 12499 ft: 14631 corp: 13/360b lim: 85 exec/s: 0 rss: 73Mb L: 46/50 MS: 1 CopyPart- 00:08:34.212 [2024-12-17 01:24:20.211820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.212 [2024-12-17 01:24:20.211853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.212 [2024-12-17 01:24:20.211964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.212 [2024-12-17 01:24:20.211986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.471 #33 NEW cov: 12499 ft: 14651 corp: 14/404b lim: 85 exec/s: 33 rss: 73Mb L: 44/50 MS: 1 PersAutoDict- DE: "\377\007"- 00:08:34.471 [2024-12-17 01:24:20.282254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.471 [2024-12-17 01:24:20.282284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.471 [2024-12-17 01:24:20.282407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.471 [2024-12-17 01:24:20.282429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.471 [2024-12-17 01:24:20.282545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.471 [2024-12-17 01:24:20.282570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.471 #35 NEW cov: 12499 ft: 15006 corp: 15/457b lim: 85 exec/s: 35 rss: 73Mb L: 53/53 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:34.471 [2024-12-17 01:24:20.332070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.471 [2024-12-17 01:24:20.332099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.471 [2024-12-17 01:24:20.332219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.471 [2024-12-17 
01:24:20.332244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.471 #36 NEW cov: 12499 ft: 15026 corp: 16/501b lim: 85 exec/s: 36 rss: 73Mb L: 44/53 MS: 1 ChangeBinInt- 00:08:34.471 [2024-12-17 01:24:20.382059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.471 [2024-12-17 01:24:20.382085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.471 #37 NEW cov: 12499 ft: 15038 corp: 17/520b lim: 85 exec/s: 37 rss: 73Mb L: 19/53 MS: 1 ChangeBit- 00:08:34.471 [2024-12-17 01:24:20.452914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.471 [2024-12-17 01:24:20.452949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.471 [2024-12-17 01:24:20.453062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.471 [2024-12-17 01:24:20.453094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.471 [2024-12-17 01:24:20.453239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.471 [2024-12-17 01:24:20.453259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.471 [2024-12-17 01:24:20.453389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:34.471 [2024-12-17 01:24:20.453417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.731 #38 NEW cov: 12499 ft: 15451 corp: 18/596b lim: 85 exec/s: 38 rss: 73Mb L: 76/76 MS: 1 InsertRepeatedBytes- 00:08:34.731 [2024-12-17 01:24:20.502734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.731 [2024-12-17 01:24:20.502765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.731 [2024-12-17 01:24:20.502888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.731 [2024-12-17 01:24:20.502912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.731 [2024-12-17 01:24:20.503040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.731 [2024-12-17 01:24:20.503065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.731 #42 NEW cov: 12499 ft: 15468 corp: 19/659b lim: 85 exec/s: 42 rss: 73Mb L: 63/76 MS: 4 ChangeBinInt-CrossOver-PersAutoDict-InsertRepeatedBytes- DE: "\377\007"- 00:08:34.731 [2024-12-17 01:24:20.552472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.731 [2024-12-17 01:24:20.552506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:34.731 #43 NEW cov: 12499 ft: 15491 corp: 20/683b lim: 85 exec/s: 43 rss: 73Mb L: 24/76 MS: 1 CopyPart- 00:08:34.731 [2024-12-17 01:24:20.622684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.731 [2024-12-17 01:24:20.622712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.731 #44 NEW cov: 12499 ft: 15497 corp: 21/709b lim: 85 exec/s: 44 rss: 73Mb L: 26/76 MS: 1 ChangeByte- 00:08:34.731 [2024-12-17 01:24:20.673065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.731 [2024-12-17 01:24:20.673094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.731 [2024-12-17 01:24:20.673217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.731 [2024-12-17 01:24:20.673241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.731 #45 NEW cov: 12499 ft: 15520 corp: 22/753b lim: 85 exec/s: 45 rss: 73Mb L: 44/76 MS: 1 ChangeByte- 00:08:34.990 [2024-12-17 01:24:20.743299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.990 [2024-12-17 01:24:20.743327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.990 [2024-12-17 01:24:20.743449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.990 [2024-12-17 01:24:20.743474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.990 #46 NEW cov: 12499 ft: 15558 corp: 23/789b lim: 85 exec/s: 46 rss: 74Mb L: 36/76 MS: 1 CopyPart- 00:08:34.990 [2024-12-17 01:24:20.793439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.990 [2024-12-17 01:24:20.793470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.990 [2024-12-17 01:24:20.793602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.990 [2024-12-17 01:24:20.793625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.990 #47 NEW cov: 12499 ft: 15629 corp: 24/833b lim: 85 exec/s: 47 rss: 74Mb L: 44/76 MS: 1 ChangeBinInt- 00:08:34.990 [2024-12-17 01:24:20.843297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.990 [2024-12-17 01:24:20.843326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.990 #48 NEW cov: 12499 ft: 15652 corp: 25/857b lim: 85 exec/s: 48 rss: 74Mb L: 24/76 MS: 1 ChangeBinInt- 00:08:34.990 [2024-12-17 01:24:20.893770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.990 [2024-12-17 01:24:20.893810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.990 [2024-12-17 01:24:20.893947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.990 [2024-12-17 01:24:20.893977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.990 #49 NEW cov: 12499 ft: 15694 corp: 26/901b lim: 85 exec/s: 49 rss: 74Mb L: 44/76 MS: 1 ChangeBit- 00:08:34.990 [2024-12-17 01:24:20.963999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.990 [2024-12-17 01:24:20.964030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.990 [2024-12-17 01:24:20.964159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.990 [2024-12-17 01:24:20.964184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.262 #50 NEW cov: 12499 ft: 15711 corp: 27/938b lim: 85 exec/s: 50 rss: 74Mb L: 37/76 MS: 1 InsertRepeatedBytes- 00:08:35.262 [2024-12-17 01:24:21.034145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.262 [2024-12-17 01:24:21.034172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.262 [2024-12-17 01:24:21.034304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.262 [2024-12-17 01:24:21.034326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.262 #51 NEW cov: 12499 ft: 15718 corp: 28/983b lim: 85 exec/s: 51 rss: 74Mb L: 45/76 MS: 1 InsertByte- 00:08:35.262 [2024-12-17 01:24:21.084262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.262 [2024-12-17 01:24:21.084292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.262 [2024-12-17 01:24:21.084409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.262 [2024-12-17 01:24:21.084428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.262 #52 NEW cov: 12499 ft: 15725 corp: 29/1027b lim: 85 exec/s: 52 rss: 74Mb L: 44/76 MS: 1 CopyPart- 00:08:35.262 [2024-12-17 01:24:21.134118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.262 [2024-12-17 01:24:21.134144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.262 #53 NEW cov: 12499 ft: 15737 corp: 30/1051b lim: 85 exec/s: 53 rss: 74Mb L: 24/76 MS: 1 ShuffleBytes- 00:08:35.262 [2024-12-17 01:24:21.204755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.262 [2024-12-17 01:24:21.204781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.262 [2024-12-17 01:24:21.204905] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.262 [2024-12-17 01:24:21.204934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.262 #54 NEW cov: 12499 ft: 15787 corp: 31/1088b lim: 85 exec/s: 27 rss: 74Mb L: 37/76 MS: 1 CrossOver- 00:08:35.262 #54 DONE cov: 12499 ft: 15787 corp: 31/1088b lim: 85 exec/s: 27 rss: 74Mb 00:08:35.262 ###### Recommended dictionary. ###### 00:08:35.262 "\377\007" # Uses: 3 00:08:35.262 ###### End of recommended dictionary. ###### 00:08:35.262 Done 54 runs in 2 second(s) 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:35.521 01:24:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:35.521 [2024-12-17 01:24:21.371800] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:35.521 [2024-12-17 01:24:21.371866] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid833985 ] 00:08:35.780 [2024-12-17 01:24:21.545659] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.780 [2024-12-17 01:24:21.567118] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.780 [2024-12-17 01:24:21.619375] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:35.780 [2024-12-17 01:24:21.635652] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:35.780 INFO: Running with entropic power schedule (0xFF, 100). 00:08:35.780 INFO: Seed: 2381545579 00:08:35.780 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:35.780 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:35.780 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:35.780 INFO: A corpus is not provided, starting from an empty corpus 00:08:35.780 #2 INITED exec/s: 0 rss: 65Mb 00:08:35.780 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:35.780 This may also happen if the target rejected all inputs we tried so far 00:08:35.780 [2024-12-17 01:24:21.680398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:35.780 [2024-12-17 01:24:21.680431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.040 NEW_FUNC[1/715]: 0x47d1f8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:36.040 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:36.040 #4 NEW cov: 12205 ft: 12194 corp: 2/6b lim: 25 exec/s: 0 rss: 72Mb L: 5/5 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:36.040 [2024-12-17 01:24:22.042833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.040 [2024-12-17 01:24:22.042889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.299 #5 NEW cov: 12318 ft: 13008 corp: 3/11b lim: 25 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 ChangeByte- 00:08:36.299 [2024-12-17 01:24:22.113589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.299 [2024-12-17 01:24:22.113629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.299 [2024-12-17 01:24:22.113752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.299 [2024-12-17 01:24:22.113781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.299 [2024-12-17 01:24:22.113922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.299 [2024-12-17 01:24:22.113947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.299 [2024-12-17 01:24:22.114089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.299 [2024-12-17 01:24:22.114114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.299 #6 NEW cov: 12324 ft: 13776 corp: 4/32b lim: 25 exec/s: 0 rss: 72Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:08:36.299 [2024-12-17 01:24:22.163082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.299 [2024-12-17 01:24:22.163115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.299 #7 NEW cov: 12409 ft: 14059 corp: 5/37b lim: 25 exec/s: 0 rss: 72Mb L: 5/21 MS: 1 ShuffleBytes- 00:08:36.299 [2024-12-17 01:24:22.213497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.299 [2024-12-17 01:24:22.213530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.299 [2024-12-17 01:24:22.213665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.299 [2024-12-17 01:24:22.213686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.299 #13 NEW cov: 12409 ft: 14340 corp: 6/47b lim: 25 exec/s: 0 rss: 72Mb L: 10/21 MS: 1 CrossOver- 00:08:36.299 [2024-12-17 01:24:22.284169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.299 [2024-12-17 01:24:22.284199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.299 [2024-12-17 01:24:22.284285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.299 [2024-12-17 01:24:22.284310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.299 [2024-12-17 01:24:22.284453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.299 [2024-12-17 01:24:22.284479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.299 [2024-12-17 01:24:22.284619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.299 [2024-12-17 01:24:22.284646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.559 #14 NEW cov: 12409 ft: 14405 corp: 7/69b lim: 25 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:08:36.559 [2024-12-17 01:24:22.334074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.559 [2024-12-17 01:24:22.334101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.559 [2024-12-17 01:24:22.334246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 
00:08:36.559 [2024-12-17 01:24:22.334265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.559 #18 NEW cov: 12409 ft: 14518 corp: 8/80b lim: 25 exec/s: 0 rss: 72Mb L: 11/22 MS: 4 ChangeBit-CopyPart-ChangeByte-CrossOver- 00:08:36.559 [2024-12-17 01:24:22.384003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.559 [2024-12-17 01:24:22.384036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.559 #19 NEW cov: 12409 ft: 14570 corp: 9/87b lim: 25 exec/s: 0 rss: 72Mb L: 7/22 MS: 1 InsertRepeatedBytes- 00:08:36.559 [2024-12-17 01:24:22.434185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.559 [2024-12-17 01:24:22.434217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.559 #20 NEW cov: 12409 ft: 14595 corp: 10/92b lim: 25 exec/s: 0 rss: 72Mb L: 5/22 MS: 1 ShuffleBytes- 00:08:36.559 [2024-12-17 01:24:22.505128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.559 [2024-12-17 01:24:22.505158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.559 [2024-12-17 01:24:22.505252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.559 [2024-12-17 01:24:22.505278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.559 [2024-12-17 01:24:22.505419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.559 [2024-12-17 01:24:22.505445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.559 [2024-12-17 01:24:22.505585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.559 [2024-12-17 01:24:22.505612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.559 #26 NEW cov: 12409 ft: 14619 corp: 11/113b lim: 25 exec/s: 0 rss: 72Mb L: 21/22 MS: 1 CrossOver- 00:08:36.559 [2024-12-17 01:24:22.554846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.559 [2024-12-17 01:24:22.554878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.559 [2024-12-17 01:24:22.554980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.559 [2024-12-17 01:24:22.555000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.819 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:36.819 #27 NEW cov: 12426 ft: 14668 corp: 12/127b lim: 25 exec/s: 0 rss: 72Mb L: 14/22 MS: 1 EraseBytes- 00:08:36.819 [2024-12-17 01:24:22.625287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.819 [2024-12-17 01:24:22.625321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.819 [2024-12-17 01:24:22.625439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.819 [2024-12-17 01:24:22.625465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.819 [2024-12-17 01:24:22.625605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.819 [2024-12-17 01:24:22.625634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.819 #28 NEW cov: 12426 ft: 14880 corp: 13/144b lim: 25 exec/s: 0 rss: 73Mb L: 17/22 MS: 1 EraseBytes- 00:08:36.819 [2024-12-17 01:24:22.695114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.819 [2024-12-17 01:24:22.695148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.819 #29 NEW cov: 12426 ft: 14924 corp: 14/150b lim: 25 exec/s: 29 rss: 73Mb L: 6/22 MS: 1 InsertByte- 00:08:36.819 [2024-12-17 01:24:22.766033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.819 [2024-12-17 01:24:22.766065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.819 [2024-12-17 01:24:22.766147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.819 [2024-12-17 01:24:22.766170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.819 [2024-12-17 01:24:22.766308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:36.819 [2024-12-17 01:24:22.766335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.819 [2024-12-17 01:24:22.766487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:36.819 [2024-12-17 01:24:22.766516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.819 #30 NEW cov: 12426 ft: 14974 corp: 15/172b lim: 25 exec/s: 30 rss: 73Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:08:37.078 [2024-12-17 01:24:22.836112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.078 [2024-12-17 01:24:22.836147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.078 [2024-12-17 01:24:22.836229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.078 [2024-12-17 01:24:22.836256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.078 [2024-12-17 01:24:22.836407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.078 [2024-12-17 01:24:22.836435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.078 #31 NEW cov: 12426 ft: 14988 corp: 16/191b lim: 25 exec/s: 31 rss: 73Mb L: 19/22 MS: 1 EraseBytes- 00:08:37.078 [2024-12-17 01:24:22.885787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.078 [2024-12-17 01:24:22.885826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.078 #32 NEW cov: 12426 ft: 14994 corp: 17/196b lim: 25 exec/s: 32 rss: 73Mb L: 5/22 MS: 1 ShuffleBytes- 00:08:37.078 [2024-12-17 01:24:22.936852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.078 [2024-12-17 01:24:22.936886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.078 [2024-12-17 01:24:22.937023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.079 [2024-12-17 01:24:22.937051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.079 [2024-12-17 01:24:22.937195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.079 [2024-12-17 01:24:22.937225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.079 [2024-12-17 01:24:22.937366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.079 [2024-12-17 01:24:22.937393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.079 #38 NEW cov: 12426 ft: 15057 corp: 18/216b lim: 25 exec/s: 38 rss: 73Mb L: 20/22 MS: 1 InsertRepeatedBytes- 00:08:37.079 [2024-12-17 01:24:23.006485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.079 [2024-12-17 01:24:23.006522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.079 [2024-12-17 01:24:23.006643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.079 [2024-12-17 01:24:23.006668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.079 [2024-12-17 01:24:23.006796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.079 [2024-12-17 01:24:23.006819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.079 #39 NEW cov: 12426 ft: 15067 corp: 19/232b lim: 25 exec/s: 39 rss: 73Mb L: 16/22 MS: 1 EraseBytes- 00:08:37.079 [2024-12-17 01:24:23.056685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.079 [2024-12-17 01:24:23.056716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:37.079 [2024-12-17 01:24:23.056856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.079 [2024-12-17 01:24:23.056881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.079 #40 NEW cov: 12426 ft: 15078 corp: 20/245b lim: 25 exec/s: 40 rss: 73Mb L: 13/22 MS: 1 CrossOver- 00:08:37.338 [2024-12-17 01:24:23.107351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.338 [2024-12-17 01:24:23.107382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.338 [2024-12-17 01:24:23.107488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.338 [2024-12-17 01:24:23.107513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.338 [2024-12-17 01:24:23.107649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.338 [2024-12-17 01:24:23.107671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.338 [2024-12-17 01:24:23.107805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.338 [2024-12-17 01:24:23.107831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.338 #41 NEW cov: 12426 ft: 15156 corp: 21/267b lim: 25 exec/s: 41 rss: 73Mb L: 22/22 MS: 1 ChangeByte- 00:08:37.338 [2024-12-17 01:24:23.157460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.338 [2024-12-17 01:24:23.157496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.338 [2024-12-17 01:24:23.157605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.338 [2024-12-17 01:24:23.157623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.338 [2024-12-17 01:24:23.157756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.338 [2024-12-17 01:24:23.157777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.338 [2024-12-17 01:24:23.157933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.338 [2024-12-17 01:24:23.157955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.338 #42 NEW cov: 12426 ft: 15204 corp: 22/288b lim: 25 exec/s: 42 rss: 73Mb L: 21/22 MS: 1 ChangeBinInt- 00:08:37.338 [2024-12-17 01:24:23.227299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.338 [2024-12-17 01:24:23.227329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.338 
[2024-12-17 01:24:23.227468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.338 [2024-12-17 01:24:23.227494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.338 #43 NEW cov: 12426 ft: 15216 corp: 23/298b lim: 25 exec/s: 43 rss: 73Mb L: 10/22 MS: 1 ChangeByte- 00:08:37.338 [2024-12-17 01:24:23.277265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.339 [2024-12-17 01:24:23.277301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.339 [2024-12-17 01:24:23.277442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.339 [2024-12-17 01:24:23.277471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.339 #44 NEW cov: 12426 ft: 15239 corp: 24/309b lim: 25 exec/s: 44 rss: 73Mb L: 11/22 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:37.598 [2024-12-17 01:24:23.348127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.598 [2024-12-17 01:24:23.348158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.598 [2024-12-17 01:24:23.348239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.598 [2024-12-17 01:24:23.348266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.598 [2024-12-17 01:24:23.348402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.598 [2024-12-17 01:24:23.348429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.598 [2024-12-17 01:24:23.348560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.598 [2024-12-17 01:24:23.348585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.598 #45 NEW cov: 12426 ft: 15243 corp: 25/331b lim: 25 exec/s: 45 rss: 73Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:08:37.598 [2024-12-17 01:24:23.418347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.598 [2024-12-17 01:24:23.418378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.598 [2024-12-17 01:24:23.418472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.598 [2024-12-17 01:24:23.418497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.598 [2024-12-17 01:24:23.418645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.598 [2024-12-17 01:24:23.418672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:37.598 [2024-12-17 01:24:23.418828] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.598 [2024-12-17 01:24:23.418854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.598 #46 NEW cov: 12426 ft: 15299 corp: 26/352b lim: 25 exec/s: 46 rss: 73Mb L: 21/22 MS: 1 ChangeBit- 00:08:37.598 [2024-12-17 01:24:23.468519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.598 [2024-12-17 01:24:23.468550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.598 [2024-12-17 01:24:23.468659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.598 [2024-12-17 01:24:23.468689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.598 [2024-12-17 01:24:23.468827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.598 [2024-12-17 01:24:23.468847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.598 [2024-12-17 01:24:23.468983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.598 [2024-12-17 01:24:23.469003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.598 #47 NEW cov: 12426 ft: 15361 corp: 27/375b lim: 25 exec/s: 47 rss: 73Mb L: 23/23 MS: 1 CopyPart- 00:08:37.598 [2024-12-17 01:24:23.539072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.598 [2024-12-17 01:24:23.539107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.598 [2024-12-17 01:24:23.539207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.598 [2024-12-17 01:24:23.539233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.599 [2024-12-17 01:24:23.539366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.599 [2024-12-17 01:24:23.539391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.599 [2024-12-17 01:24:23.539538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.599 [2024-12-17 01:24:23.539560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.599 [2024-12-17 01:24:23.539697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:37.599 [2024-12-17 01:24:23.539723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:37.599 #48 NEW cov: 12433 ft: 15389 corp: 28/400b lim: 25 exec/s: 48 rss: 73Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:37.858 
[2024-12-17 01:24:23.609124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.858 [2024-12-17 01:24:23.609159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.858 [2024-12-17 01:24:23.609278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.858 [2024-12-17 01:24:23.609304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.858 [2024-12-17 01:24:23.609450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.858 [2024-12-17 01:24:23.609476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.858 [2024-12-17 01:24:23.609619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.858 [2024-12-17 01:24:23.609644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.858 #49 NEW cov: 12433 ft: 15408 corp: 29/421b lim: 25 exec/s: 49 rss: 74Mb L: 21/25 MS: 1 CrossOver- 00:08:37.858 [2024-12-17 01:24:23.679200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.858 [2024-12-17 01:24:23.679234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.858 [2024-12-17 01:24:23.679349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.858 [2024-12-17 01:24:23.679369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.858 [2024-12-17 01:24:23.679498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.858 [2024-12-17 01:24:23.679524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.858 #50 NEW cov: 12433 ft: 15424 corp: 30/438b lim: 25 exec/s: 25 rss: 74Mb L: 17/25 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:37.858 #50 DONE cov: 12433 ft: 15424 corp: 30/438b lim: 25 exec/s: 25 rss: 74Mb 00:08:37.858 ###### Recommended dictionary. ###### 00:08:37.858 "\001\000\000\000" # Uses: 1 00:08:37.858 ###### End of recommended dictionary. 
###### 00:08:37.858 Done 50 runs in 2 second(s) 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:37.858 01:24:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:37.858 [2024-12-17 01:24:23.860166] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:37.858 [2024-12-17 01:24:23.860218] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid834407 ] 00:08:38.117 [2024-12-17 01:24:24.030055] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.117 [2024-12-17 01:24:24.051741] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.117 [2024-12-17 01:24:24.104060] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:38.117 [2024-12-17 01:24:24.120355] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:38.376 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.376 INFO: Seed: 571571115 00:08:38.376 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:38.376 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:38.376 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:38.376 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.376 #2 INITED exec/s: 0 rss: 64Mb 00:08:38.376 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:38.376 This may also happen if the target rejected all inputs we tried so far 00:08:38.376 [2024-12-17 01:24:24.165827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.376 [2024-12-17 01:24:24.165870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.376 [2024-12-17 01:24:24.165921] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.376 [2024-12-17 01:24:24.165938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.635 NEW_FUNC[1/716]: 0x47e2e8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:38.635 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:38.635 #3 NEW cov: 12277 ft: 12275 corp: 2/53b lim: 100 exec/s: 0 rss: 72Mb L: 52/52 MS: 1 InsertRepeatedBytes- 00:08:38.635 [2024-12-17 01:24:24.496598] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752012334148 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.635 [2024-12-17 01:24:24.496632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.635 [2024-12-17 01:24:24.496688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.635 [2024-12-17 01:24:24.496704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.635 #4 NEW cov: 12390 ft: 12893 corp: 3/93b lim: 100 exec/s: 0 rss: 72Mb L: 40/52 MS: 1 CrossOver- 00:08:38.635 [2024-12-17 01:24:24.536623] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.635 [2024-12-17 01:24:24.536653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.635 [2024-12-17 01:24:24.536696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.635 [2024-12-17 01:24:24.536713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.635 #10 NEW cov: 12396 ft: 13118 corp: 4/145b lim: 100 exec/s: 0 rss: 72Mb L: 52/52 MS: 1 CopyPart- 00:08:38.635 [2024-12-17 01:24:24.596830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.635 [2024-12-17 01:24:24.596856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.635 [2024-12-17 01:24:24.596893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.635 [2024-12-17 01:24:24.596909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.635 #11 NEW cov: 12481 ft: 13359 corp: 5/196b lim: 100 exec/s: 0 rss: 72Mb L: 51/52 MS: 1 EraseBytes- 00:08:38.635 [2024-12-17 01:24:24.636933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752012334148 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.636 [2024-12-17 01:24:24.636960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.636 [2024-12-17 01:24:24.637000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.636 [2024-12-17 01:24:24.637017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.895 #12 NEW cov: 12481 ft: 13498 corp: 6/237b lim: 100 exec/s: 0 rss: 72Mb L: 41/52 MS: 1 InsertByte- 00:08:38.895 [2024-12-17 01:24:24.697102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.697129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.895 [2024-12-17 01:24:24.697168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.697184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.895 #13 NEW cov: 12481 ft: 13593 corp: 7/289b lim: 100 exec/s: 0 rss: 72Mb L: 52/52 MS: 1 ShuffleBytes- 00:08:38.895 [2024-12-17 01:24:24.737660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:217020520795931395 
len:772 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.737687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.895 [2024-12-17 01:24:24.737739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:217020518514230019 len:772 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.737756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.895 [2024-12-17 01:24:24.737813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:217020518514230019 len:772 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.737846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.895 [2024-12-17 01:24:24.737902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:217020518514230019 len:772 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.737918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.895 #18 NEW cov: 12481 ft: 14118 corp: 8/379b lim: 100 exec/s: 0 rss: 72Mb L: 90/90 MS: 5 ChangeBit-ChangeByte-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:38.895 [2024-12-17 01:24:24.777506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.777537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.895 [2024-12-17 01:24:24.777583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131890428167236 len:69 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.777600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.895 #19 NEW cov: 12481 ft: 14153 corp: 9/434b lim: 100 exec/s: 0 rss: 72Mb L: 55/90 MS: 1 CMP- DE: "d\000\000\000"- 00:08:38.895 [2024-12-17 01:24:24.837969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.837996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.895 [2024-12-17 01:24:24.838048] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.838066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.895 [2024-12-17 01:24:24.838118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.838133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.895 [2024-12-17 01:24:24.838190] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.838206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.895 #20 NEW cov: 12481 ft: 14176 corp: 10/522b lim: 100 exec/s: 0 rss: 72Mb L: 88/90 MS: 1 CopyPart- 00:08:38.895 [2024-12-17 01:24:24.897838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4991189346054063172 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.897881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.895 [2024-12-17 01:24:24.897924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:38.895 [2024-12-17 01:24:24.897941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.154 #21 NEW cov: 12481 ft: 14214 corp: 11/573b lim: 100 exec/s: 0 rss: 72Mb L: 51/90 MS: 1 ChangeBit- 00:08:39.154 [2024-12-17 01:24:24.937919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752012334148 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:24.937945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.154 [2024-12-17 01:24:24.937995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:24.938010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.154 #22 NEW cov: 12481 ft: 14229 corp: 12/614b lim: 100 exec/s: 0 rss: 72Mb L: 41/90 MS: 1 ChangeBit- 00:08:39.154 [2024-12-17 01:24:24.998407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4991189346054063172 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:24.998434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.154 [2024-12-17 01:24:24.998477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:24.998494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.154 [2024-12-17 01:24:24.998549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:24.998565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.154 [2024-12-17 01:24:24.998623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:24.998638] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.154 #23 NEW cov: 12481 ft: 14234 corp: 13/697b lim: 100 exec/s: 0 rss: 72Mb L: 83/90 MS: 1 CopyPart- 00:08:39.154 [2024-12-17 01:24:25.058277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:25.058303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.154 [2024-12-17 01:24:25.058342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:25.058358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.154 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:39.154 #24 NEW cov: 12504 ft: 14285 corp: 14/748b lim: 100 exec/s: 0 rss: 73Mb L: 51/90 MS: 1 CopyPart- 00:08:39.154 [2024-12-17 01:24:25.098662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:25.098688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.154 [2024-12-17 01:24:25.098735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:25.098755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.154 [2024-12-17 01:24:25.098811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:25.098827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.154 [2024-12-17 01:24:25.098884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.154 [2024-12-17 01:24:25.098900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.154 #25 NEW cov: 12504 ft: 14347 corp: 15/836b lim: 100 exec/s: 0 rss: 73Mb L: 88/90 MS: 1 PersAutoDict- DE: "d\000\000\000"- 00:08:39.413 [2024-12-17 01:24:25.158864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.413 [2024-12-17 01:24:25.158892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.413 [2024-12-17 01:24:25.158946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.413 [2024-12-17 01:24:25.158965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.413 [2024-12-17 01:24:25.159019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.413 [2024-12-17 01:24:25.159034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.413 [2024-12-17 01:24:25.159090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.413 [2024-12-17 01:24:25.159105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.413 #26 NEW cov: 12504 ft: 14367 corp: 16/924b lim: 100 exec/s: 26 rss: 73Mb L: 88/90 MS: 1 ChangeByte- 00:08:39.413 [2024-12-17 01:24:25.218543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.413 [2024-12-17 01:24:25.218571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.413 #27 NEW cov: 12504 ft: 15120 corp: 17/951b lim: 100 exec/s: 27 rss: 73Mb L: 27/90 MS: 1 EraseBytes- 00:08:39.413 [2024-12-17 01:24:25.258801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.413 [2024-12-17 01:24:25.258827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.413 [2024-12-17 01:24:25.258867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.413 [2024-12-17 01:24:25.258884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.413 #28 NEW cov: 12504 ft: 15141 corp: 18/1003b lim: 100 exec/s: 28 rss: 73Mb L: 52/90 MS: 1 ChangeByte- 00:08:39.413 [2024-12-17 01:24:25.319303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:217020520795931395 len:772 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.413 [2024-12-17 01:24:25.319330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.413 [2024-12-17 01:24:25.319385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:217020518514230019 len:772 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.413 [2024-12-17 01:24:25.319401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.413 [2024-12-17 01:24:25.319455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:217020518514230019 len:772 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.413 [2024-12-17 01:24:25.319470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.414 [2024-12-17 01:24:25.319524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:217020518514230019 len:772 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:39.414 [2024-12-17 01:24:25.319556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.414 #29 NEW cov: 12504 ft: 15158 corp: 19/1093b lim: 100 exec/s: 29 rss: 73Mb L: 90/90 MS: 1 ChangeBinInt- 00:08:39.414 [2024-12-17 01:24:25.379138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:6576737889865188420 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.414 [2024-12-17 01:24:25.379165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.414 [2024-12-17 01:24:25.379222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.414 [2024-12-17 01:24:25.379240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.414 #30 NEW cov: 12504 ft: 15171 corp: 20/1145b lim: 100 exec/s: 30 rss: 73Mb L: 52/90 MS: 1 InsertByte- 00:08:39.672 [2024-12-17 01:24:25.419276] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.672 [2024-12-17 01:24:25.419305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.672 [2024-12-17 01:24:25.419353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.672 [2024-12-17 01:24:25.419370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.672 #31 NEW cov: 12504 ft: 15196 corp: 21/1197b lim: 100 exec/s: 31 rss: 73Mb L: 52/90 MS: 1 ShuffleBytes- 00:08:39.672 [2024-12-17 01:24:25.459367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:1093 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.672 [2024-12-17 01:24:25.459392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.672 [2024-12-17 01:24:25.459431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.672 [2024-12-17 01:24:25.459448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.672 #32 NEW cov: 12504 ft: 15197 corp: 22/1249b lim: 100 exec/s: 32 rss: 73Mb L: 52/90 MS: 1 ChangeBit- 00:08:39.672 [2024-12-17 01:24:25.499341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.672 [2024-12-17 01:24:25.499369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.672 #33 NEW cov: 12504 ft: 15210 corp: 23/1284b lim: 100 exec/s: 33 rss: 73Mb L: 35/90 MS: 1 EraseBytes- 00:08:39.672 [2024-12-17 01:24:25.559674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752012334150 
len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.673 [2024-12-17 01:24:25.559702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.673 [2024-12-17 01:24:25.559751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.673 [2024-12-17 01:24:25.559767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.673 #34 NEW cov: 12504 ft: 15231 corp: 24/1324b lim: 100 exec/s: 34 rss: 73Mb L: 40/90 MS: 1 ChangeBinInt- 00:08:39.673 [2024-12-17 01:24:25.599957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.673 [2024-12-17 01:24:25.599983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.673 [2024-12-17 01:24:25.600025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.673 [2024-12-17 01:24:25.600040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.673 [2024-12-17 01:24:25.600096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.673 [2024-12-17 01:24:25.600117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.673 #35 NEW cov: 12504 ft: 15545 corp: 25/1393b lim: 100 exec/s: 35 rss: 73Mb L: 69/90 MS: 1 EraseBytes- 00:08:39.673 [2024-12-17 01:24:25.640224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.673 [2024-12-17 01:24:25.640251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.673 [2024-12-17 01:24:25.640304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.673 [2024-12-17 01:24:25.640320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.673 [2024-12-17 01:24:25.640376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.673 [2024-12-17 01:24:25.640390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.673 [2024-12-17 01:24:25.640446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.673 [2024-12-17 01:24:25.640462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.673 #36 NEW cov: 12504 ft: 15554 corp: 26/1481b lim: 100 exec/s: 36 rss: 73Mb L: 88/90 MS: 
1 ChangeByte- 00:08:39.932 [2024-12-17 01:24:25.680367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.680393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.932 [2024-12-17 01:24:25.680441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213772 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.680458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.932 [2024-12-17 01:24:25.680512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.680527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.932 [2024-12-17 01:24:25.680583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.680599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.932 #37 NEW cov: 12504 ft: 15595 corp: 27/1569b lim: 100 exec/s: 37 rss: 73Mb L: 88/90 MS: 1 ChangeBinInt- 00:08:39.932 [2024-12-17 01:24:25.720135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.720163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.932 [2024-12-17 01:24:25.720208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.720225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.932 #38 NEW cov: 12504 ft: 15643 corp: 28/1620b lim: 100 exec/s: 38 rss: 73Mb L: 51/90 MS: 1 ChangeByte- 00:08:39.932 [2024-12-17 01:24:25.760591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.760618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.932 [2024-12-17 01:24:25.760672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.760688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.932 [2024-12-17 01:24:25.760742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.760758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.932 [2024-12-17 01:24:25.760816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.760831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.932 #39 NEW cov: 12504 ft: 15651 corp: 29/1716b lim: 100 exec/s: 39 rss: 73Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:08:39.932 [2024-12-17 01:24:25.800387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.800414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.932 [2024-12-17 01:24:25.800456] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.800473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.932 #40 NEW cov: 12504 ft: 15673 corp: 30/1767b lim: 100 exec/s: 40 rss: 73Mb L: 51/96 MS: 1 ShuffleBytes- 00:08:39.932 [2024-12-17 01:24:25.860729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.860756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.932 [2024-12-17 01:24:25.860812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.860830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.932 [2024-12-17 01:24:25.860903] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.860920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.932 #41 NEW cov: 12504 ft: 15689 corp: 31/1836b lim: 100 exec/s: 41 rss: 73Mb L: 69/96 MS: 1 ShuffleBytes- 00:08:39.932 [2024-12-17 01:24:25.920758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:25601 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.920785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.932 [2024-12-17 01:24:25.920827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131890428167236 len:69 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.932 [2024-12-17 01:24:25.920841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.191 #42 NEW cov: 12504 ft: 15700 corp: 32/1891b lim: 100 exec/s: 42 rss: 73Mb L: 55/96 MS: 1 PersAutoDict- DE: 
"d\000\000\000"- 00:08:40.191 [2024-12-17 01:24:25.960979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.191 [2024-12-17 01:24:25.961006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.192 [2024-12-17 01:24:25.961055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131756037798393 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:25.961071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.192 [2024-12-17 01:24:25.961126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:25.961142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.192 #43 NEW cov: 12504 ft: 15725 corp: 33/1957b lim: 100 exec/s: 43 rss: 74Mb L: 66/96 MS: 1 InsertRepeatedBytes- 00:08:40.192 [2024-12-17 01:24:26.020989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:25601 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:26.021014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.192 [2024-12-17 01:24:26.021052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919166645303526468 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:26.021069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.192 #44 NEW cov: 12504 ft: 15762 corp: 34/2011b lim: 100 exec/s: 44 rss: 74Mb L: 54/96 MS: 1 EraseBytes- 00:08:40.192 [2024-12-17 01:24:26.081482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:217020520795931395 len:772 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:26.081509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.192 [2024-12-17 01:24:26.081557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:217020518514230019 len:772 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:26.081573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.192 [2024-12-17 01:24:26.081628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:217020518514230019 len:772 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:26.081644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.192 [2024-12-17 01:24:26.081700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:217020518514230055 len:772 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:26.081717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.192 #45 NEW cov: 12504 ft: 15774 corp: 35/2102b lim: 100 exec/s: 45 rss: 74Mb L: 91/96 MS: 1 InsertByte- 00:08:40.192 [2024-12-17 01:24:26.141637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4919131752016135236 len:46405 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:26.141664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.192 [2024-12-17 01:24:26.141718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:26.141736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.192 [2024-12-17 01:24:26.141797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:26.141813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.192 [2024-12-17 01:24:26.141882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4919131752989213764 len:17477 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.192 [2024-12-17 01:24:26.141898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.192 #46 NEW cov: 12504 ft: 15775 corp: 36/2190b lim: 100 exec/s: 23 rss: 74Mb L: 88/96 MS: 1 ChangeBinInt- 00:08:40.192 #46 DONE cov: 12504 ft: 15775 corp: 36/2190b lim: 100 exec/s: 23 rss: 74Mb 00:08:40.192 ###### Recommended dictionary. ###### 00:08:40.192 "d\000\000\000" # Uses: 2 00:08:40.192 ###### End of recommended dictionary. 
###### 00:08:40.192 Done 46 runs in 2 second(s) 00:08:40.451 01:24:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:40.451 01:24:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:40.452 01:24:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.452 01:24:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:40.452 00:08:40.452 real 1m3.711s 00:08:40.452 user 1m39.619s 00:08:40.452 sys 0m7.950s 00:08:40.452 01:24:26 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:40.452 01:24:26 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:40.452 ************************************ 00:08:40.452 END TEST nvmf_llvm_fuzz 00:08:40.452 ************************************ 00:08:40.452 01:24:26 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:40.452 01:24:26 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:40.452 01:24:26 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:40.452 01:24:26 llvm_fuzz -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:40.452 01:24:26 llvm_fuzz -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:40.452 01:24:26 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:40.452 ************************************ 00:08:40.452 START TEST vfio_llvm_fuzz 00:08:40.452 ************************************ 00:08:40.452 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:40.713 * Looking for test storage... 
00:08:40.713 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:40.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:40.713 --rc genhtml_branch_coverage=1 00:08:40.713 --rc genhtml_function_coverage=1 00:08:40.713 --rc genhtml_legend=1 00:08:40.713 --rc geninfo_all_blocks=1 00:08:40.713 --rc geninfo_unexecuted_blocks=1 00:08:40.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.713 ' 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:40.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:40.713 --rc genhtml_branch_coverage=1 00:08:40.713 --rc genhtml_function_coverage=1 00:08:40.713 --rc genhtml_legend=1 00:08:40.713 --rc geninfo_all_blocks=1 00:08:40.713 --rc geninfo_unexecuted_blocks=1 00:08:40.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.713 ' 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:40.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:40.713 --rc genhtml_branch_coverage=1 00:08:40.713 --rc genhtml_function_coverage=1 00:08:40.713 --rc genhtml_legend=1 00:08:40.713 --rc geninfo_all_blocks=1 00:08:40.713 --rc geninfo_unexecuted_blocks=1 00:08:40.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.713 ' 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:40.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:40.713 --rc genhtml_branch_coverage=1 00:08:40.713 --rc genhtml_function_coverage=1 00:08:40.713 --rc genhtml_legend=1 00:08:40.713 --rc geninfo_all_blocks=1 00:08:40.713 --rc geninfo_unexecuted_blocks=1 00:08:40.713 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.713 ' 00:08:40.713 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- 
# CONFIG_OCF_PATH= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_AIO_FSDEV=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_UBLK=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_ISAL_CRYPTO=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_OPENSSL_PATH= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OCF=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_FUSE=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_VTUNE_DIR= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FSDEV=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_CRYPTO=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_PGO_USE=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_VHOST=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_DAOS=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DAOS_DIR= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_UNIT_TESTS=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_VIRTIO=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_DPDK_UADK=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_COVERAGE=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_RDMA=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_LZ4=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_URING_PATH= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_XNVME=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_VFIO_USER=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_ARCH=native 00:08:40.714 01:24:26 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_HAVE_EVP_MAC=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_URING_ZNS=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_WERROR=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_HAVE_LIBBSD=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_UBSAN=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_IPSEC_MB_DIR= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_GOLANG=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_ISAL=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_IDXD_KERNEL=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_RDMA_PROV=verbs 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_APPS=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_SHARED=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_HAVE_KEYUTILS=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_FC_PATH= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_FC=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_AVAHI=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_FIO_PLUGIN=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_RAID5F=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_EXAMPLES=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_TESTS=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_CRYPTO_MLX5=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_MAX_LCORES=128 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_IPSEC_MB=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_PGO_DIR= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_DEBUG=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_CROSS_PREFIX= 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_COPY_FILE_RANGE=y 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_URING=n 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:40.714 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:40.714 #define SPDK_CONFIG_H 00:08:40.714 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:40.714 #define SPDK_CONFIG_APPS 1 00:08:40.714 #define SPDK_CONFIG_ARCH native 00:08:40.714 #undef SPDK_CONFIG_ASAN 00:08:40.714 #undef SPDK_CONFIG_AVAHI 00:08:40.714 #undef SPDK_CONFIG_CET 00:08:40.715 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:40.715 #define SPDK_CONFIG_COVERAGE 1 00:08:40.715 #define SPDK_CONFIG_CROSS_PREFIX 00:08:40.715 #undef SPDK_CONFIG_CRYPTO 00:08:40.715 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:40.715 #undef SPDK_CONFIG_CUSTOMOCF 00:08:40.715 #undef SPDK_CONFIG_DAOS 00:08:40.715 #define SPDK_CONFIG_DAOS_DIR 00:08:40.715 #define SPDK_CONFIG_DEBUG 1 00:08:40.715 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:40.715 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:40.715 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:40.715 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:40.715 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:40.715 #undef SPDK_CONFIG_DPDK_UADK 00:08:40.715 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:40.715 #define SPDK_CONFIG_EXAMPLES 1 00:08:40.715 #undef SPDK_CONFIG_FC 00:08:40.715 #define SPDK_CONFIG_FC_PATH 00:08:40.715 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:40.715 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:40.715 #define SPDK_CONFIG_FSDEV 1 00:08:40.715 #undef SPDK_CONFIG_FUSE 00:08:40.715 #define SPDK_CONFIG_FUZZER 1 00:08:40.715 
#define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:40.715 #undef SPDK_CONFIG_GOLANG 00:08:40.715 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:40.715 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:40.715 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:40.715 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:40.715 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:40.715 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:40.715 #undef SPDK_CONFIG_HAVE_LZ4 00:08:40.715 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:40.715 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:40.715 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:40.715 #define SPDK_CONFIG_IDXD 1 00:08:40.715 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:40.715 #undef SPDK_CONFIG_IPSEC_MB 00:08:40.715 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:40.715 #define SPDK_CONFIG_ISAL 1 00:08:40.715 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:40.715 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:40.715 #define SPDK_CONFIG_LIBDIR 00:08:40.715 #undef SPDK_CONFIG_LTO 00:08:40.715 #define SPDK_CONFIG_MAX_LCORES 128 00:08:40.715 #define SPDK_CONFIG_NVME_CUSE 1 00:08:40.715 #undef SPDK_CONFIG_OCF 00:08:40.715 #define SPDK_CONFIG_OCF_PATH 00:08:40.715 #define SPDK_CONFIG_OPENSSL_PATH 00:08:40.715 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:40.715 #define SPDK_CONFIG_PGO_DIR 00:08:40.715 #undef SPDK_CONFIG_PGO_USE 00:08:40.715 #define SPDK_CONFIG_PREFIX /usr/local 00:08:40.715 #undef SPDK_CONFIG_RAID5F 00:08:40.715 #undef SPDK_CONFIG_RBD 00:08:40.715 #define SPDK_CONFIG_RDMA 1 00:08:40.715 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:40.715 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:40.715 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:40.715 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:40.715 #undef SPDK_CONFIG_SHARED 00:08:40.715 #undef SPDK_CONFIG_SMA 00:08:40.715 #define SPDK_CONFIG_TESTS 1 00:08:40.715 #undef SPDK_CONFIG_TSAN 00:08:40.715 #define SPDK_CONFIG_UBLK 1 00:08:40.715 #define SPDK_CONFIG_UBSAN 1 00:08:40.715 #undef SPDK_CONFIG_UNIT_TESTS 00:08:40.715 #undef SPDK_CONFIG_URING 00:08:40.715 #define SPDK_CONFIG_URING_PATH 00:08:40.715 #undef SPDK_CONFIG_URING_ZNS 00:08:40.715 #undef SPDK_CONFIG_USDT 00:08:40.715 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:40.715 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:40.715 #define SPDK_CONFIG_VFIO_USER 1 00:08:40.715 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:40.715 #define SPDK_CONFIG_VHOST 1 00:08:40.715 #define SPDK_CONFIG_VIRTIO 1 00:08:40.715 #undef SPDK_CONFIG_VTUNE 00:08:40.715 #define SPDK_CONFIG_VTUNE_DIR 00:08:40.715 #define SPDK_CONFIG_WERROR 1 00:08:40.715 #define SPDK_CONFIG_WPDK_DIR 00:08:40.715 #undef SPDK_CONFIG_XNVME 00:08:40.715 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- 
paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:40.715 01:24:26 
llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:40.715 01:24:26 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:40.715 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 
00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : v22.11.4 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:40.716 01:24:26 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@179 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@179 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@180 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@180 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:40.716 01:24:26 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@185 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:40.716 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@185 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@189 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@189 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@193 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@193 -- # PYTHONDONTWRITEBYTECODE=1 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@197 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@197 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@198 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@198 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@202 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@203 -- # rm -rf /var/tmp/asan_suppression_file 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # cat 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@240 -- # echo leak:libfuse3.so 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # '[' -z /var/spdk/dependencies ']' 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@249 -- # export DEPENDENCY_DIR 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@253 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@253 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@254 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@254 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@257 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@257 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@258 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@258 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@263 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@263 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # _LCOV_MAIN=0 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@266 -- # _LCOV_LLVM=1 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV= 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ '' == *clang* ]] 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ 1 -eq 1 ]] 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV=1 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@271 -- # _lcov_opt[_LCOV_MAIN]= 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@276 -- # '[' 0 -eq 0 ']' 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@277 -- # export valgrind= 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@277 -- # valgrind= 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@283 -- # uname -s 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@283 -- # '[' Linux = Linux ']' 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@284 -- # HUGEMEM=4096 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # export CLEAR_HUGE=yes 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # CLEAR_HUGE=yes 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # MAKE=make 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@288 -- # MAKEFLAGS=-j112 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@304 -- # export HUGEMEM=4096 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@304 -- # HUGEMEM=4096 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # NO_HUGE=() 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@307 -- # TEST_MODE= 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@329 -- # [[ -z 834836 ]] 00:08:40.717 
01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@329 -- # kill -0 834836 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@339 -- # [[ -v testdir ]] 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # local requested_size=2147483648 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@342 -- # local mount target_dir 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local -A mounts fss sizes avails uses 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@345 -- # local source fs size avail mount use 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local storage_fallback storage_candidates 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # mktemp -udt spdk.XXXXXX 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # storage_fallback=/tmp/spdk.Bgu86L 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@354 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # [[ -n '' ]] 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@361 -- # [[ -n '' ]] 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@366 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.Bgu86L/tests/vfio /tmp/spdk.Bgu86L 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@369 -- # requested_size=2214592512 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@338 -- # df -T 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@338 -- # grep -v Filesystem 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_devtmpfs 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=devtmpfs 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=67108864 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=67108864 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=0 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=/dev/pmem0 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=ext2 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=785162240 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=5284429824 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=4499267584 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- 
common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_root 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=overlay 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=52893396992 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=61730607104 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=8837210112 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30861873152 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865301504 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=3428352 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=12340121600 00:08:40.717 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=12346122240 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=6000640 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30864969728 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865305600 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=335872 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=6173044736 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=6173057024 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=12288 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@377 -- # printf '* Looking for test storage...\n' 00:08:40.718 * Looking for test storage... 
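The trace above is the set_test_storage step from autotest_common.sh: it reads df -T into the mounts/fss/avails/sizes/uses arrays, then walks storage_candidates until it finds a filesystem with at least requested_size bytes free and prints "* Looking for test storage...". The following is a minimal bash sketch of that selection, reusing the variable names visible in the trace (requested_size, storage_candidates, target_space); the candidate list and the 1K-block-to-bytes conversion are assumptions for illustration, not the verbatim SPDK helper.

    #!/usr/bin/env bash
    # Hedged sketch of the storage-selection logic traced above; the real
    # set_test_storage in autotest_common.sh does additional bookkeeping
    # (new_size checks, per-candidate fallbacks, etc.).
    requested_size=2147483648                 # ~2 GiB plus slack, as in the trace
    storage_candidates=("$PWD" /tmp)          # placeholder candidates for this sketch

    declare -A avails
    while read -r source fs size use avail _ mount; do
        avails["$mount"]=$((avail * 1024))    # df -T prints 1K blocks; byte conversion assumed
    done < <(df -T | grep -v Filesystem)

    for target_dir in "${storage_candidates[@]}"; do
        mount_point=$(df "$target_dir" | awk '$1 !~ /Filesystem/ {print $6}')
        target_space=${avails[$mount_point]:-0}
        if (( target_space >= requested_size )); then
            printf '* Found test storage at %s\n' "$target_dir"
            export SPDK_TEST_STORAGE=$target_dir
            break
        fi
    done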
00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # local target_space new_size 00:08:40.718 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@380 -- # for target_dir in "${storage_candidates[@]}" 00:08:40.977 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:40.977 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:40.977 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # mount=/ 00:08:40.977 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # target_space=52893396992 00:08:40.977 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@386 -- # (( target_space == 0 || target_space < requested_size )) 00:08:40.977 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@389 -- # (( target_space >= requested_size )) 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == tmpfs ]] 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == ramfs ]] 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ / == / ]] 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@392 -- # new_size=11051802624 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@398 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@398 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@399 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:40.978 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # return 0 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1668 -- # set -o errtrace 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1672 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1673 -- # true 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1675 -- # xtrace_fd 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:40.978 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:40.978 --rc genhtml_branch_coverage=1 00:08:40.978 --rc genhtml_function_coverage=1 00:08:40.978 --rc genhtml_legend=1 00:08:40.978 --rc geninfo_all_blocks=1 00:08:40.978 --rc geninfo_unexecuted_blocks=1 00:08:40.978 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.978 ' 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:40.978 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:40.978 --rc genhtml_branch_coverage=1 00:08:40.978 --rc genhtml_function_coverage=1 00:08:40.978 --rc genhtml_legend=1 00:08:40.978 --rc geninfo_all_blocks=1 00:08:40.978 --rc geninfo_unexecuted_blocks=1 00:08:40.978 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.978 ' 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:40.978 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:40.978 --rc genhtml_branch_coverage=1 00:08:40.978 --rc genhtml_function_coverage=1 00:08:40.978 --rc genhtml_legend=1 00:08:40.978 --rc geninfo_all_blocks=1 00:08:40.978 --rc geninfo_unexecuted_blocks=1 00:08:40.978 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.978 ' 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:40.978 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:40.978 --rc genhtml_branch_coverage=1 00:08:40.978 --rc genhtml_function_coverage=1 00:08:40.978 --rc genhtml_legend=1 00:08:40.978 --rc geninfo_all_blocks=1 00:08:40.978 --rc geninfo_unexecuted_blocks=1 00:08:40.978 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:40.978 ' 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:40.978 01:24:26 
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:40.978 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:40.978 01:24:26 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:40.978 [2024-12-17 01:24:26.875030] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:40.978 [2024-12-17 01:24:26.875105] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid834989 ] 00:08:40.978 [2024-12-17 01:24:26.947329] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.238 [2024-12-17 01:24:26.986523] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.238 INFO: Running with entropic power schedule (0xFF, 100). 00:08:41.238 INFO: Seed: 3602568785 00:08:41.238 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:41.238 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:41.238 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:41.238 INFO: A corpus is not provided, starting from an empty corpus 00:08:41.238 #2 INITED exec/s: 0 rss: 65Mb 00:08:41.238 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:41.238 This may also happen if the target rejected all inputs we tried so far 00:08:41.238 [2024-12-17 01:24:27.218347] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:08:41.755 NEW_FUNC[1/668]: 0x4521a8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:08:41.755 NEW_FUNC[2/668]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:41.755 #53 NEW cov: 11089 ft: 11019 corp: 2/7b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:08:42.014 #54 NEW cov: 11103 ft: 14945 corp: 3/13b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 CopyPart- 00:08:42.272 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:42.272 #55 NEW cov: 11127 ft: 15942 corp: 4/19b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ChangeBinInt- 00:08:42.272 #56 NEW cov: 11127 ft: 16946 corp: 5/25b lim: 6 exec/s: 56 rss: 74Mb L: 6/6 MS: 1 ChangeBit- 00:08:42.529 #62 NEW cov: 11127 ft: 17563 corp: 6/31b lim: 6 exec/s: 62 rss: 74Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:42.787 #63 NEW cov: 11127 ft: 18051 corp: 7/37b lim: 6 exec/s: 63 rss: 74Mb L: 6/6 MS: 1 CopyPart- 00:08:42.787 #64 NEW cov: 11131 ft: 18272 corp: 8/43b lim: 6 exec/s: 64 rss: 74Mb L: 6/6 MS: 1 ChangeBinInt- 00:08:43.045 #65 NEW cov: 11131 ft: 18550 corp: 9/49b lim: 6 exec/s: 65 rss: 74Mb L: 6/6 MS: 1 ChangeBit- 00:08:43.302 #71 NEW cov: 11138 ft: 18703 corp: 10/55b lim: 6 exec/s: 71 rss: 74Mb L: 6/6 MS: 1 CopyPart- 00:08:43.561 #72 NEW cov: 11138 ft: 19029 corp: 11/61b lim: 6 exec/s: 36 rss: 74Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:43.561 #72 DONE cov: 11138 ft: 19029 corp: 11/61b lim: 6 exec/s: 36 rss: 74Mb 00:08:43.561 Done 72 runs in 2 second(s) 00:08:43.561 [2024-12-17 01:24:29.328975] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:08:43.561 
01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:43.819 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:43.819 01:24:29 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:43.819 [2024-12-17 01:24:29.613159] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:43.819 [2024-12-17 01:24:29.613250] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid835432 ] 00:08:43.819 [2024-12-17 01:24:29.684198] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.819 [2024-12-17 01:24:29.722130] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.078 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:44.078 INFO: Seed: 2047600546 00:08:44.078 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:44.078 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:44.078 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:44.078 INFO: A corpus is not provided, starting from an empty corpus 00:08:44.078 #2 INITED exec/s: 0 rss: 65Mb 00:08:44.078 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:44.078 This may also happen if the target rejected all inputs we tried so far 00:08:44.078 [2024-12-17 01:24:29.967434] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:08:44.078 [2024-12-17 01:24:30.013821] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:44.078 [2024-12-17 01:24:30.013848] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:44.078 [2024-12-17 01:24:30.013870] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:44.594 NEW_FUNC[1/668]: 0x452748 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:08:44.594 NEW_FUNC[2/668]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:44.594 #11 NEW cov: 11076 ft: 11037 corp: 2/5b lim: 4 exec/s: 0 rss: 72Mb L: 4/4 MS: 4 CopyPart-InsertByte-InsertByte-CrossOver- 00:08:44.594 [2024-12-17 01:24:30.496185] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:44.594 [2024-12-17 01:24:30.496221] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:44.594 [2024-12-17 01:24:30.496239] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:44.851 NEW_FUNC[1/1]: 0x1f27078 in spdk_get_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1269 00:08:44.851 #15 NEW cov: 11091 ft: 14685 corp: 3/9b lim: 4 exec/s: 0 rss: 73Mb L: 4/4 MS: 4 CopyPart-InsertByte-InsertByte-CopyPart- 00:08:44.851 [2024-12-17 01:24:30.689124] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:44.851 [2024-12-17 01:24:30.689147] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:44.851 [2024-12-17 01:24:30.689164] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:44.851 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:44.851 #26 NEW cov: 11108 ft: 15415 corp: 4/13b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 1 CrossOver- 00:08:45.109 [2024-12-17 01:24:30.860573] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:45.109 [2024-12-17 01:24:30.860595] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:45.109 [2024-12-17 01:24:30.860612] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:45.109 #27 NEW cov: 11108 ft: 15895 corp: 5/17b lim: 4 exec/s: 27 rss: 74Mb L: 4/4 MS: 1 ChangeBit- 00:08:45.109 [2024-12-17 01:24:31.035042] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:45.109 [2024-12-17 01:24:31.035065] vfio_user.c:3106:vfio_user_log: *ERROR*: 
/tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:45.109 [2024-12-17 01:24:31.035082] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:45.367 #32 NEW cov: 11108 ft: 16493 corp: 6/21b lim: 4 exec/s: 32 rss: 74Mb L: 4/4 MS: 5 ChangeByte-CopyPart-ChangeByte-ChangeByte-CopyPart- 00:08:45.367 [2024-12-17 01:24:31.212732] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:45.367 [2024-12-17 01:24:31.212755] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:45.367 [2024-12-17 01:24:31.212772] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:45.367 #43 NEW cov: 11108 ft: 16597 corp: 7/25b lim: 4 exec/s: 43 rss: 74Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:45.624 [2024-12-17 01:24:31.386144] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:45.624 [2024-12-17 01:24:31.386166] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:45.624 [2024-12-17 01:24:31.386183] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:45.624 #44 NEW cov: 11108 ft: 16963 corp: 8/29b lim: 4 exec/s: 44 rss: 74Mb L: 4/4 MS: 1 ChangeByte- 00:08:45.624 [2024-12-17 01:24:31.557448] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:45.624 [2024-12-17 01:24:31.557470] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:45.624 [2024-12-17 01:24:31.557487] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:45.882 #45 NEW cov: 11108 ft: 17194 corp: 9/33b lim: 4 exec/s: 45 rss: 74Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:45.882 [2024-12-17 01:24:31.733003] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:45.882 [2024-12-17 01:24:31.733025] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:45.882 [2024-12-17 01:24:31.733045] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:45.882 #46 NEW cov: 11115 ft: 17332 corp: 10/37b lim: 4 exec/s: 46 rss: 74Mb L: 4/4 MS: 1 CopyPart- 00:08:46.140 [2024-12-17 01:24:31.909485] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:46.140 [2024-12-17 01:24:31.909507] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:46.140 [2024-12-17 01:24:31.909525] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:46.140 #47 NEW cov: 11115 ft: 17362 corp: 11/41b lim: 4 exec/s: 23 rss: 74Mb L: 4/4 MS: 1 ChangeByte- 00:08:46.140 #47 DONE cov: 11115 ft: 17362 corp: 11/41b lim: 4 exec/s: 23 rss: 74Mb 00:08:46.140 Done 47 runs in 2 second(s) 00:08:46.140 [2024-12-17 01:24:32.031012] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- 
vfio/run.sh@23 -- # local timen=1 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:46.398 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:46.398 01:24:32 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:46.398 [2024-12-17 01:24:32.314231] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:46.398 [2024-12-17 01:24:32.314297] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid835968 ] 00:08:46.398 [2024-12-17 01:24:32.384586] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.655 [2024-12-17 01:24:32.422588] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.655 INFO: Running with entropic power schedule (0xFF, 100). 00:08:46.655 INFO: Seed: 443636449 00:08:46.655 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:46.655 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:46.655 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:46.655 INFO: A corpus is not provided, starting from an empty corpus 00:08:46.655 #2 INITED exec/s: 0 rss: 65Mb 00:08:46.655 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:46.655 This may also happen if the target rejected all inputs we tried so far 00:08:46.655 [2024-12-17 01:24:32.655923] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:08:46.913 [2024-12-17 01:24:32.696159] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:47.170 NEW_FUNC[1/669]: 0x453138 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:08:47.170 NEW_FUNC[2/669]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:47.170 #28 NEW cov: 11058 ft: 10858 corp: 2/9b lim: 8 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:08:47.170 [2024-12-17 01:24:33.174156] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:47.429 #29 NEW cov: 11075 ft: 13790 corp: 3/17b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:47.429 [2024-12-17 01:24:33.361680] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:47.688 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:47.688 #30 NEW cov: 11092 ft: 14389 corp: 4/25b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 1 ChangeBit- 00:08:47.688 [2024-12-17 01:24:33.553635] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:47.688 #51 NEW cov: 11092 ft: 14775 corp: 5/33b lim: 8 exec/s: 51 rss: 74Mb L: 8/8 MS: 1 ChangeByte- 00:08:47.947 [2024-12-17 01:24:33.738817] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:47.947 #57 NEW cov: 11092 ft: 15840 corp: 6/41b lim: 8 exec/s: 57 rss: 74Mb L: 8/8 MS: 1 ChangeBit- 00:08:47.947 [2024-12-17 01:24:33.923246] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.204 #63 NEW cov: 11092 ft: 16106 corp: 7/49b lim: 8 exec/s: 63 rss: 74Mb L: 8/8 MS: 1 ChangeByte- 00:08:48.204 [2024-12-17 01:24:34.107397] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.462 #64 NEW cov: 11092 ft: 16418 corp: 8/57b lim: 8 exec/s: 64 rss: 74Mb L: 8/8 MS: 1 ChangeByte- 00:08:48.462 [2024-12-17 01:24:34.291951] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.462 #67 NEW cov: 11092 ft: 16884 corp: 9/65b lim: 8 exec/s: 67 rss: 75Mb L: 8/8 MS: 3 EraseBytes-InsertRepeatedBytes-CrossOver- 00:08:48.719 [2024-12-17 01:24:34.480610] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.719 #73 NEW cov: 11099 ft: 17068 corp: 10/73b lim: 8 exec/s: 73 rss: 75Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:48.719 [2024-12-17 01:24:34.669440] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.977 #74 NEW cov: 11099 ft: 17277 corp: 11/81b lim: 8 exec/s: 37 rss: 75Mb L: 8/8 MS: 1 CrossOver- 00:08:48.977 #74 DONE cov: 11099 ft: 17277 corp: 11/81b lim: 8 exec/s: 37 rss: 75Mb 00:08:48.977 Done 74 runs in 2 second(s) 00:08:48.977 [2024-12-17 01:24:34.795983] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- 
../common.sh@72 -- # (( i++ )) 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:49.238 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:49.239 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:49.239 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:49.239 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:49.239 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:49.239 01:24:35 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:49.239 [2024-12-17 01:24:35.086143] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:49.239 [2024-12-17 01:24:35.086211] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid836505 ] 00:08:49.239 [2024-12-17 01:24:35.158176] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.239 [2024-12-17 01:24:35.196169] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.498 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:49.498 INFO: Seed: 3220634539 00:08:49.498 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:49.498 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:49.498 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:49.498 INFO: A corpus is not provided, starting from an empty corpus 00:08:49.498 #2 INITED exec/s: 0 rss: 65Mb 00:08:49.498 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:49.498 This may also happen if the target rejected all inputs we tried so far 00:08:49.498 [2024-12-17 01:24:35.437576] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:08:50.013 NEW_FUNC[1/668]: 0x453828 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:08:50.013 NEW_FUNC[2/668]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:50.013 #209 NEW cov: 11027 ft: 11026 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:50.271 NEW_FUNC[1/1]: 0x1353e18 in nvmf_bdev_ctrlr_read_cmd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr_bdev.c:336 00:08:50.271 #210 NEW cov: 11080 ft: 14367 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ChangeByte- 00:08:50.271 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:50.271 #211 NEW cov: 11097 ft: 14834 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:08:50.529 #212 NEW cov: 11097 ft: 16014 corp: 5/129b lim: 32 exec/s: 212 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:08:50.787 #213 NEW cov: 11097 ft: 16827 corp: 6/161b lim: 32 exec/s: 213 rss: 74Mb L: 32/32 MS: 1 ChangeASCIIInt- 00:08:50.787 #214 NEW cov: 11097 ft: 17021 corp: 7/193b lim: 32 exec/s: 214 rss: 74Mb L: 32/32 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:51.044 #215 NEW cov: 11097 ft: 17189 corp: 8/225b lim: 32 exec/s: 215 rss: 74Mb L: 32/32 MS: 1 CrossOver- 00:08:51.302 #216 NEW cov: 11097 ft: 17288 corp: 9/257b lim: 32 exec/s: 216 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:51.302 #227 NEW cov: 11104 ft: 17406 corp: 10/289b lim: 32 exec/s: 227 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:51.560 #228 NEW cov: 11104 ft: 17558 corp: 11/321b lim: 32 exec/s: 114 rss: 75Mb L: 32/32 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:51.560 #228 DONE cov: 11104 ft: 17558 corp: 11/321b lim: 32 exec/s: 114 rss: 75Mb 00:08:51.560 ###### Recommended dictionary. ###### 00:08:51.560 "\377\377\377\377" # Uses: 3 00:08:51.560 ###### End of recommended dictionary. 
###### 00:08:51.560 Done 228 runs in 2 second(s) 00:08:51.560 [2024-12-17 01:24:37.509976] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:51.818 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:51.818 01:24:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:51.818 [2024-12-17 01:24:37.788782] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:51.819 [2024-12-17 01:24:37.788860] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid836944 ] 00:08:52.077 [2024-12-17 01:24:37.858181] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.077 [2024-12-17 01:24:37.896350] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.077 INFO: Running with entropic power schedule (0xFF, 100). 00:08:52.077 INFO: Seed: 1628679390 00:08:52.335 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:52.335 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:52.335 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:52.335 INFO: A corpus is not provided, starting from an empty corpus 00:08:52.335 #2 INITED exec/s: 0 rss: 65Mb 00:08:52.335 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:52.335 This may also happen if the target rejected all inputs we tried so far 00:08:52.335 [2024-12-17 01:24:38.136645] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:08:52.335 [2024-12-17 01:24:38.187831] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 5642533481369980494 > max 8796093022208 00:08:52.335 [2024-12-17 01:24:38.187856] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x4e4e4e0a00000000, 0x9c9c9c584e4e4e4e) offset=0x4e4e4e4e4e4e4e4e flags=0x3: No space left on device 00:08:52.335 [2024-12-17 01:24:38.187866] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:52.335 [2024-12-17 01:24:38.187898] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:52.335 [2024-12-17 01:24:38.188831] vfio_user.c:3104:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0x4e4e4e0a00000000, 0x9c9c9c584e4e4e4e) flags=0: No such file or directory 00:08:52.335 [2024-12-17 01:24:38.188844] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:52.335 [2024-12-17 01:24:38.188859] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:52.593 NEW_FUNC[1/669]: 0x4540a8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:08:52.593 NEW_FUNC[2/669]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:52.593 #100 NEW cov: 11081 ft: 11043 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 3 InsertRepeatedBytes-EraseBytes-InsertRepeatedBytes- 00:08:52.893 [2024-12-17 01:24:38.659178] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 5629499534213120000 > max 8796093022208 00:08:52.893 [2024-12-17 01:24:38.659211] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x4e4e0a00000000, 0x4e6e4e0a00000000) offset=0x4e4e4e4e4e4e4e4e flags=0x3: No space left on device 00:08:52.893 [2024-12-17 01:24:38.659223] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:52.893 [2024-12-17 
01:24:38.659240] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:52.893 [2024-12-17 01:24:38.660195] vfio_user.c:3104:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0x4e4e0a00000000, 0x4e6e4e0a00000000) flags=0: No such file or directory 00:08:52.893 [2024-12-17 01:24:38.660214] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:52.893 [2024-12-17 01:24:38.660231] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:52.893 NEW_FUNC[1/1]: 0x15b97b8 in sq_dbl_tailp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:572 00:08:52.893 #111 NEW cov: 11098 ft: 14282 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:52.893 [2024-12-17 01:24:38.862944] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 5642533481369980502 > max 8796093022208 00:08:52.893 [2024-12-17 01:24:38.862969] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x4e4e4e0a00000000, 0x9c9c9c584e4e4e56) offset=0x4e4e4e4e4e4e4e4e flags=0x3: No space left on device 00:08:52.893 [2024-12-17 01:24:38.862981] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:52.893 [2024-12-17 01:24:38.862999] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:52.893 [2024-12-17 01:24:38.863952] vfio_user.c:3104:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0x4e4e4e0a00000000, 0x9c9c9c584e4e4e56) flags=0: No such file or directory 00:08:52.893 [2024-12-17 01:24:38.863971] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:52.893 [2024-12-17 01:24:38.863988] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:53.218 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:53.218 #127 NEW cov: 11115 ft: 14774 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:53.218 [2024-12-17 01:24:39.057775] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 5642533481369980502 > max 8796093022208 00:08:53.218 [2024-12-17 01:24:39.057807] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x4e0a004e4e000000, 0x9c584e9c9c4e4e56) offset=0x4e4e4e4e4e4e4e4e flags=0x3: No space left on device 00:08:53.218 [2024-12-17 01:24:39.057818] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:53.218 [2024-12-17 01:24:39.057835] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:53.218 [2024-12-17 01:24:39.058775] vfio_user.c:3104:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0x4e0a004e4e000000, 0x9c584e9c9c4e4e56) flags=0: No such file or directory 00:08:53.218 [2024-12-17 01:24:39.058800] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:53.218 [2024-12-17 01:24:39.058817] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:53.218 #128 NEW cov: 11115 ft: 15604 corp: 5/129b lim: 32 exec/s: 128 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:53.531 [2024-12-17 01:24:39.262472] vfio_user.c:3106:vfio_user_log: *ERROR*: 
/tmp/vfio-user-4/domain/1: DMA region size 5642533481369980494 > max 8796093022208 00:08:53.531 [2024-12-17 01:24:39.262495] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x4e4e4e4e4e4e4e4e, 0x9c9c9c9c9c9c9c9c) offset=0x4e4e4e4e4e4e4e4e flags=0x3: No space left on device 00:08:53.531 [2024-12-17 01:24:39.262506] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:53.531 [2024-12-17 01:24:39.262523] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:53.531 [2024-12-17 01:24:39.263449] vfio_user.c:3104:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0x4e4e4e4e4e4e4e4e, 0x9c9c9c9c9c9c9c9c) flags=0: No such file or directory 00:08:53.531 [2024-12-17 01:24:39.263468] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:53.531 [2024-12-17 01:24:39.263484] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:53.531 #129 NEW cov: 11115 ft: 16018 corp: 6/161b lim: 32 exec/s: 129 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:08:53.531 [2024-12-17 01:24:39.453519] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 5620492489577201750 > max 8796093022208 00:08:53.531 [2024-12-17 01:24:39.453542] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x4e4e4e0a00000000, 0x9c4e4e2e00000056) offset=0xa4e284e4e4e4e4e flags=0x3: No space left on device 00:08:53.531 [2024-12-17 01:24:39.453552] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:53.531 [2024-12-17 01:24:39.453584] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:53.531 [2024-12-17 01:24:39.454545] vfio_user.c:3104:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0x4e4e4e0a00000000, 0x9c4e4e2e00000056) flags=0: No such file or directory 00:08:53.531 [2024-12-17 01:24:39.454564] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:53.531 [2024-12-17 01:24:39.454581] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:53.789 #138 NEW cov: 11115 ft: 16087 corp: 7/193b lim: 32 exec/s: 138 rss: 74Mb L: 32/32 MS: 4 EraseBytes-CrossOver-ChangeByte-InsertByte- 00:08:53.789 [2024-12-17 01:24:39.652263] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 22041146411601486 > max 8796093022208 00:08:53.789 [2024-12-17 01:24:39.652285] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x4e4e4e0a00000000, 0x4e9c9c584e4e4e4e) offset=0x4e4e4e4e4e4e0a00 flags=0x3: No space left on device 00:08:53.789 [2024-12-17 01:24:39.652295] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:53.789 [2024-12-17 01:24:39.652326] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:53.789 [2024-12-17 01:24:39.653302] vfio_user.c:3104:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0x4e4e4e0a00000000, 0x4e9c9c584e4e4e4e) flags=0: No such file or directory 00:08:53.789 [2024-12-17 01:24:39.653321] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:53.789 [2024-12-17 01:24:39.653336] 
vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:53.789 #139 NEW cov: 11115 ft: 16207 corp: 8/225b lim: 32 exec/s: 139 rss: 74Mb L: 32/32 MS: 1 CrossOver- 00:08:54.047 [2024-12-17 01:24:39.846405] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [0xa00000000000000, 0xa000000564e4e4e) fd=329 offset=0x4e4e4e4e4e4e0a00 prot=0x3: Permission denied 00:08:54.047 [2024-12-17 01:24:39.846427] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0xa00000000000000, 0xa000000564e4e4e) offset=0x4e4e4e4e4e4e0a00 flags=0x3: Permission denied 00:08:54.047 [2024-12-17 01:24:39.846437] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Permission denied 00:08:54.047 [2024-12-17 01:24:39.846469] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:54.047 [2024-12-17 01:24:39.847407] vfio_user.c:3104:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0xa00000000000000, 0xa000000564e4e4e) flags=0: No such file or directory 00:08:54.047 [2024-12-17 01:24:39.847425] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:54.047 [2024-12-17 01:24:39.847441] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:54.047 #140 NEW cov: 11122 ft: 16439 corp: 9/257b lim: 32 exec/s: 140 rss: 74Mb L: 32/32 MS: 1 CrossOver- 00:08:54.047 [2024-12-17 01:24:40.039255] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 5642533481369980494 > max 8796093022208 00:08:54.047 [2024-12-17 01:24:40.039279] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x4e4e4e4e4e4e4e4e, 0x9c9c9c9c9c9c9c9c) offset=0x4eb1ac4e4e4e4e4e flags=0x3: No space left on device 00:08:54.047 [2024-12-17 01:24:40.039290] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:54.047 [2024-12-17 01:24:40.039306] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:54.047 [2024-12-17 01:24:40.040262] vfio_user.c:3104:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0x4e4e4e4e4e4e4e4e, 0x9c9c9c9c9c9c9c9c) flags=0: No such file or directory 00:08:54.047 [2024-12-17 01:24:40.040281] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:54.047 [2024-12-17 01:24:40.040297] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:54.305 #141 NEW cov: 11122 ft: 16769 corp: 10/289b lim: 32 exec/s: 70 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:54.305 #141 DONE cov: 11122 ft: 16769 corp: 10/289b lim: 32 exec/s: 70 rss: 74Mb 00:08:54.305 Done 141 runs in 2 second(s) 00:08:54.305 [2024-12-17 01:24:40.171995] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- 
vfio/run.sh@23 -- # local timen=1 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:54.563 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:54.563 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:54.564 01:24:40 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:54.564 [2024-12-17 01:24:40.456815] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:54.564 [2024-12-17 01:24:40.456885] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid837341 ] 00:08:54.564 [2024-12-17 01:24:40.527971] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:54.564 [2024-12-17 01:24:40.566541] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:54.821 INFO: Running with entropic power schedule (0xFF, 100). 00:08:54.821 INFO: Seed: 2711414 00:08:54.821 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:54.821 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:54.822 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:54.822 INFO: A corpus is not provided, starting from an empty corpus 00:08:54.822 #2 INITED exec/s: 0 rss: 65Mb 00:08:54.822 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:54.822 This may also happen if the target rejected all inputs we tried so far 00:08:54.822 [2024-12-17 01:24:40.807128] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:08:55.080 [2024-12-17 01:24:40.858837] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.080 [2024-12-17 01:24:40.858874] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.337 NEW_FUNC[1/667]: 0x454aa8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:08:55.337 NEW_FUNC[2/667]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:55.337 #5 NEW cov: 10957 ft: 11047 corp: 2/14b lim: 13 exec/s: 0 rss: 72Mb L: 13/13 MS: 3 CMP-ChangeByte-InsertRepeatedBytes- DE: "\000l"- 00:08:55.337 [2024-12-17 01:24:41.316716] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.337 [2024-12-17 01:24:41.316755] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.594 NEW_FUNC[1/3]: 0x45a5d8 in write_complete /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:353 00:08:55.594 NEW_FUNC[2/3]: 0x45b518 in read_complete /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:324 00:08:55.594 #11 NEW cov: 11098 ft: 14419 corp: 3/27b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 1 ChangeByte- 00:08:55.594 [2024-12-17 01:24:41.493728] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.594 [2024-12-17 01:24:41.493758] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.851 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:55.851 #12 NEW cov: 11115 ft: 15990 corp: 4/40b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 1 ShuffleBytes- 00:08:55.851 [2024-12-17 01:24:41.682059] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.851 [2024-12-17 01:24:41.682090] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.851 #13 NEW cov: 11115 ft: 16205 corp: 5/53b lim: 13 exec/s: 13 rss: 74Mb L: 13/13 MS: 1 CrossOver- 00:08:56.109 [2024-12-17 01:24:41.859667] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.109 [2024-12-17 01:24:41.859697] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.109 #19 NEW cov: 11115 ft: 16558 corp: 6/66b lim: 13 exec/s: 19 rss: 74Mb L: 13/13 MS: 1 CrossOver- 00:08:56.109 [2024-12-17 01:24:42.039420] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.109 [2024-12-17 01:24:42.039450] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.367 #20 NEW cov: 11115 ft: 16721 corp: 7/79b lim: 13 exec/s: 20 rss: 74Mb L: 13/13 MS: 1 ChangeByte- 00:08:56.367 [2024-12-17 01:24:42.222354] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.367 [2024-12-17 01:24:42.222384] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.367 #21 NEW cov: 11115 ft: 16899 corp: 8/92b lim: 13 exec/s: 21 rss: 74Mb L: 
13/13 MS: 1 ShuffleBytes- 00:08:56.624 [2024-12-17 01:24:42.405901] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.624 [2024-12-17 01:24:42.405930] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.624 #27 NEW cov: 11115 ft: 17207 corp: 9/105b lim: 13 exec/s: 27 rss: 74Mb L: 13/13 MS: 1 ChangeByte- 00:08:56.624 [2024-12-17 01:24:42.582395] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.624 [2024-12-17 01:24:42.582425] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.882 #28 NEW cov: 11122 ft: 17713 corp: 10/118b lim: 13 exec/s: 28 rss: 75Mb L: 13/13 MS: 1 CopyPart- 00:08:56.882 [2024-12-17 01:24:42.768322] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.882 [2024-12-17 01:24:42.768352] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.882 #31 NEW cov: 11122 ft: 18001 corp: 11/131b lim: 13 exec/s: 15 rss: 75Mb L: 13/13 MS: 3 CrossOver-ChangeASCIIInt-InsertByte- 00:08:56.882 #31 DONE cov: 11122 ft: 18001 corp: 11/131b lim: 13 exec/s: 15 rss: 75Mb 00:08:56.882 ###### Recommended dictionary. ###### 00:08:56.882 "\000l" # Uses: 0 00:08:56.882 ###### End of recommended dictionary. ###### 00:08:56.882 Done 31 runs in 2 second(s) 00:08:57.139 [2024-12-17 01:24:42.896969] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:57.139 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:57.139 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:57.397 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:57.397 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:57.397 01:24:43 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:57.397 [2024-12-17 01:24:43.174684] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:57.397 [2024-12-17 01:24:43.174764] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid837878 ] 00:08:57.397 [2024-12-17 01:24:43.243363] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:57.397 [2024-12-17 01:24:43.281273] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:57.655 INFO: Running with entropic power schedule (0xFF, 100). 00:08:57.655 INFO: Seed: 2718728200 00:08:57.655 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:57.655 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:57.655 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:57.655 INFO: A corpus is not provided, starting from an empty corpus 00:08:57.655 #2 INITED exec/s: 0 rss: 65Mb 00:08:57.655 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:57.655 This may also happen if the target rejected all inputs we tried so far 00:08:57.655 [2024-12-17 01:24:43.519484] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:08:57.655 [2024-12-17 01:24:43.546822] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.655 [2024-12-17 01:24:43.546853] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.221 NEW_FUNC[1/670]: 0x455798 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:58.221 NEW_FUNC[2/670]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:58.221 #5 NEW cov: 11073 ft: 10969 corp: 2/10b lim: 9 exec/s: 0 rss: 72Mb L: 9/9 MS: 3 InsertRepeatedBytes-CMP-InsertRepeatedBytes- DE: "\000 "- 00:08:58.221 [2024-12-17 01:24:43.977972] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.221 [2024-12-17 01:24:43.978014] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.221 #6 NEW cov: 11087 ft: 13362 corp: 3/19b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 CopyPart- 00:08:58.221 [2024-12-17 01:24:44.093025] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.221 [2024-12-17 01:24:44.093058] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.221 #12 NEW cov: 11087 ft: 14259 corp: 4/28b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:58.221 [2024-12-17 01:24:44.217125] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.221 [2024-12-17 01:24:44.217159] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.479 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:58.479 #13 NEW cov: 11107 ft: 15412 corp: 5/37b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 1 ChangeBit- 00:08:58.479 [2024-12-17 01:24:44.341134] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.479 [2024-12-17 01:24:44.341166] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.479 #14 NEW cov: 11107 ft: 15679 corp: 6/46b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 1 CrossOver- 00:08:58.479 [2024-12-17 01:24:44.455144] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.479 [2024-12-17 01:24:44.455177] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.737 #15 NEW cov: 11107 ft: 15716 corp: 7/55b lim: 9 exec/s: 15 rss: 74Mb L: 9/9 MS: 1 ChangeBit- 00:08:58.737 [2024-12-17 01:24:44.570226] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.737 [2024-12-17 01:24:44.570258] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.737 #16 NEW cov: 11107 ft: 15803 corp: 8/64b lim: 9 exec/s: 16 rss: 74Mb L: 9/9 MS: 1 CrossOver- 00:08:58.737 [2024-12-17 01:24:44.684287] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.737 [2024-12-17 01:24:44.684319] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.995 #17 NEW cov: 11107 ft: 16116 corp: 9/73b 
lim: 9 exec/s: 17 rss: 74Mb L: 9/9 MS: 1 CopyPart- 00:08:58.995 [2024-12-17 01:24:44.798262] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.995 [2024-12-17 01:24:44.798295] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.995 #18 NEW cov: 11107 ft: 16221 corp: 10/82b lim: 9 exec/s: 18 rss: 75Mb L: 9/9 MS: 1 CMP- DE: "\001\005\311x\201\273\327\026"- 00:08:58.995 [2024-12-17 01:24:44.912240] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.995 [2024-12-17 01:24:44.912272] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.995 #19 NEW cov: 11107 ft: 16500 corp: 11/91b lim: 9 exec/s: 19 rss: 75Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:59.253 [2024-12-17 01:24:45.026418] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.253 [2024-12-17 01:24:45.026450] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.253 #20 NEW cov: 11107 ft: 16514 corp: 12/100b lim: 9 exec/s: 20 rss: 75Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:59.253 [2024-12-17 01:24:45.140464] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.253 [2024-12-17 01:24:45.140495] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.253 #21 NEW cov: 11107 ft: 16780 corp: 13/109b lim: 9 exec/s: 21 rss: 75Mb L: 9/9 MS: 1 ChangeByte- 00:08:59.510 [2024-12-17 01:24:45.265612] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.510 [2024-12-17 01:24:45.265647] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.510 #22 NEW cov: 11114 ft: 17097 corp: 14/118b lim: 9 exec/s: 22 rss: 75Mb L: 9/9 MS: 1 ChangeByte- 00:08:59.510 [2024-12-17 01:24:45.379698] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.510 [2024-12-17 01:24:45.379730] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.510 #28 NEW cov: 11114 ft: 17132 corp: 15/127b lim: 9 exec/s: 28 rss: 75Mb L: 9/9 MS: 1 CMP- DE: "\017\000"- 00:08:59.510 [2024-12-17 01:24:45.503789] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.510 [2024-12-17 01:24:45.503829] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.768 #34 NEW cov: 11114 ft: 17196 corp: 16/136b lim: 9 exec/s: 17 rss: 75Mb L: 9/9 MS: 1 ChangeByte- 00:08:59.768 #34 DONE cov: 11114 ft: 17196 corp: 16/136b lim: 9 exec/s: 17 rss: 75Mb 00:08:59.768 ###### Recommended dictionary. ###### 00:08:59.768 "\000 " # Uses: 0 00:08:59.768 "\001\005\311x\201\273\327\026" # Uses: 0 00:08:59.768 "\017\000" # Uses: 0 00:08:59.768 ###### End of recommended dictionary. 
###### 00:08:59.768 Done 34 runs in 2 second(s) 00:08:59.768 [2024-12-17 01:24:45.595971] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:09:00.026 01:24:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:09:00.026 01:24:45 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:00.026 01:24:45 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:00.026 01:24:45 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:09:00.026 00:09:00.026 real 0m19.467s 00:09:00.027 user 0m27.093s 00:09:00.027 sys 0m1.889s 00:09:00.027 01:24:45 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:00.027 01:24:45 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:00.027 ************************************ 00:09:00.027 END TEST vfio_llvm_fuzz 00:09:00.027 ************************************ 00:09:00.027 00:09:00.027 real 1m23.539s 00:09:00.027 user 2m6.883s 00:09:00.027 sys 0m10.059s 00:09:00.027 01:24:45 llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:00.027 01:24:45 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:00.027 ************************************ 00:09:00.027 END TEST llvm_fuzz 00:09:00.027 ************************************ 00:09:00.027 01:24:45 -- spdk/autotest.sh@374 -- # [[ '' -eq 1 ]] 00:09:00.027 01:24:45 -- spdk/autotest.sh@381 -- # trap - SIGINT SIGTERM EXIT 00:09:00.027 01:24:45 -- spdk/autotest.sh@383 -- # timing_enter post_cleanup 00:09:00.027 01:24:45 -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:00.027 01:24:45 -- common/autotest_common.sh@10 -- # set +x 00:09:00.027 01:24:45 -- spdk/autotest.sh@384 -- # autotest_cleanup 00:09:00.027 01:24:45 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:09:00.027 01:24:45 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:09:00.027 01:24:45 -- common/autotest_common.sh@10 -- # set +x 00:09:06.590 INFO: APP EXITING 00:09:06.590 INFO: killing all VMs 00:09:06.590 INFO: killing vhost app 00:09:06.590 INFO: EXIT DONE 00:09:09.128 Waiting for block devices as requested 00:09:09.128 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:09.128 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:09.128 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:09.128 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:09.128 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:09.128 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:09.386 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:09.386 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:09.386 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:09.386 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:09.645 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:09.645 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:09.645 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:09.904 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:09.904 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:09.904 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:10.162 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:13.449 Cleaning 00:09:13.449 Removing: /dev/shm/spdk_tgt_trace.pid809884 00:09:13.449 Removing: /var/run/dpdk/spdk_pid807423 00:09:13.449 Removing: /var/run/dpdk/spdk_pid808601 00:09:13.449 Removing: /var/run/dpdk/spdk_pid809884 00:09:13.449 Removing: /var/run/dpdk/spdk_pid810344 00:09:13.449 Removing: 
/var/run/dpdk/spdk_pid811428 00:09:13.449 Removing: /var/run/dpdk/spdk_pid811459 00:09:13.449 Removing: /var/run/dpdk/spdk_pid812557 00:09:13.708 Removing: /var/run/dpdk/spdk_pid812569 00:09:13.708 Removing: /var/run/dpdk/spdk_pid813003 00:09:13.708 Removing: /var/run/dpdk/spdk_pid813332 00:09:13.708 Removing: /var/run/dpdk/spdk_pid813656 00:09:13.708 Removing: /var/run/dpdk/spdk_pid813993 00:09:13.708 Removing: /var/run/dpdk/spdk_pid814108 00:09:13.708 Removing: /var/run/dpdk/spdk_pid814359 00:09:13.708 Removing: /var/run/dpdk/spdk_pid814639 00:09:13.708 Removing: /var/run/dpdk/spdk_pid814961 00:09:13.708 Removing: /var/run/dpdk/spdk_pid815803 00:09:13.708 Removing: /var/run/dpdk/spdk_pid818740 00:09:13.708 Removing: /var/run/dpdk/spdk_pid819024 00:09:13.708 Removing: /var/run/dpdk/spdk_pid819308 00:09:13.708 Removing: /var/run/dpdk/spdk_pid819424 00:09:13.708 Removing: /var/run/dpdk/spdk_pid819885 00:09:13.708 Removing: /var/run/dpdk/spdk_pid820006 00:09:13.708 Removing: /var/run/dpdk/spdk_pid820502 00:09:13.708 Removing: /var/run/dpdk/spdk_pid820665 00:09:13.708 Removing: /var/run/dpdk/spdk_pid820905 00:09:13.708 Removing: /var/run/dpdk/spdk_pid821028 00:09:13.708 Removing: /var/run/dpdk/spdk_pid821166 00:09:13.708 Removing: /var/run/dpdk/spdk_pid821329 00:09:13.708 Removing: /var/run/dpdk/spdk_pid821720 00:09:13.708 Removing: /var/run/dpdk/spdk_pid822000 00:09:13.708 Removing: /var/run/dpdk/spdk_pid822280 00:09:13.708 Removing: /var/run/dpdk/spdk_pid822540 00:09:13.708 Removing: /var/run/dpdk/spdk_pid823121 00:09:13.708 Removing: /var/run/dpdk/spdk_pid823654 00:09:13.708 Removing: /var/run/dpdk/spdk_pid823991 00:09:13.708 Removing: /var/run/dpdk/spdk_pid824479 00:09:13.708 Removing: /var/run/dpdk/spdk_pid825009 00:09:13.708 Removing: /var/run/dpdk/spdk_pid825305 00:09:13.708 Removing: /var/run/dpdk/spdk_pid825830 00:09:13.708 Removing: /var/run/dpdk/spdk_pid826366 00:09:13.708 Removing: /var/run/dpdk/spdk_pid826657 00:09:13.708 Removing: /var/run/dpdk/spdk_pid827189 00:09:13.708 Removing: /var/run/dpdk/spdk_pid827637 00:09:13.708 Removing: /var/run/dpdk/spdk_pid828010 00:09:13.708 Removing: /var/run/dpdk/spdk_pid828544 00:09:13.708 Removing: /var/run/dpdk/spdk_pid828993 00:09:13.708 Removing: /var/run/dpdk/spdk_pid829370 00:09:13.708 Removing: /var/run/dpdk/spdk_pid830009 00:09:13.708 Removing: /var/run/dpdk/spdk_pid830592 00:09:13.708 Removing: /var/run/dpdk/spdk_pid831281 00:09:13.708 Removing: /var/run/dpdk/spdk_pid831816 00:09:13.708 Removing: /var/run/dpdk/spdk_pid832151 00:09:13.708 Removing: /var/run/dpdk/spdk_pid832634 00:09:13.708 Removing: /var/run/dpdk/spdk_pid833138 00:09:13.708 Removing: /var/run/dpdk/spdk_pid833450 00:09:13.708 Removing: /var/run/dpdk/spdk_pid833985 00:09:13.708 Removing: /var/run/dpdk/spdk_pid834407 00:09:13.708 Removing: /var/run/dpdk/spdk_pid834989 00:09:13.708 Removing: /var/run/dpdk/spdk_pid835432 00:09:13.967 Removing: /var/run/dpdk/spdk_pid835968 00:09:13.967 Removing: /var/run/dpdk/spdk_pid836505 00:09:13.967 Removing: /var/run/dpdk/spdk_pid836944 00:09:13.967 Removing: /var/run/dpdk/spdk_pid837341 00:09:13.967 Removing: /var/run/dpdk/spdk_pid837878 00:09:13.967 Clean 00:09:13.967 01:24:59 -- common/autotest_common.sh@1451 -- # return 0 00:09:13.967 01:24:59 -- spdk/autotest.sh@385 -- # timing_exit post_cleanup 00:09:13.967 01:24:59 -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:13.967 01:24:59 -- common/autotest_common.sh@10 -- # set +x 00:09:13.967 01:24:59 -- spdk/autotest.sh@387 -- # timing_exit autotest 00:09:13.967 01:24:59 -- 
common/autotest_common.sh@730 -- # xtrace_disable 00:09:13.967 01:24:59 -- common/autotest_common.sh@10 -- # set +x 00:09:13.967 01:24:59 -- spdk/autotest.sh@388 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:13.967 01:24:59 -- spdk/autotest.sh@390 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:13.967 01:24:59 -- spdk/autotest.sh@390 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:13.967 01:24:59 -- spdk/autotest.sh@392 -- # [[ y == y ]] 00:09:13.967 01:24:59 -- spdk/autotest.sh@394 -- # hostname 00:09:13.967 01:24:59 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:09:14.226 geninfo: WARNING: invalid characters removed from testname! 00:09:17.516 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda 00:09:22.792 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda 00:09:26.985 01:25:12 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:35.102 01:25:19 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:39.292 01:25:24 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:44.561 01:25:30 -- spdk/autotest.sh@401 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:49.833 01:25:35 -- spdk/autotest.sh@402 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:55.105 01:25:40 -- spdk/autotest.sh@403 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:00.374 01:25:45 -- spdk/autotest.sh@404 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:10:00.374 01:25:45 -- common/autotest_common.sh@1680 -- $ [[ y == y ]] 00:10:00.374 01:25:45 -- common/autotest_common.sh@1681 -- $ awk '{print $NF}' 00:10:00.374 01:25:45 -- common/autotest_common.sh@1681 -- $ lcov --version 00:10:00.374 01:25:45 -- common/autotest_common.sh@1681 -- $ lt 1.15 2 00:10:00.374 01:25:45 -- scripts/common.sh@373 -- $ cmp_versions 1.15 '<' 2 00:10:00.374 01:25:45 -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:10:00.374 01:25:45 -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:10:00.374 01:25:45 -- scripts/common.sh@336 -- $ IFS=.-: 00:10:00.374 01:25:45 -- scripts/common.sh@336 -- $ read -ra ver1 00:10:00.374 01:25:45 -- scripts/common.sh@337 -- $ IFS=.-: 00:10:00.374 01:25:45 -- scripts/common.sh@337 -- $ read -ra ver2 00:10:00.374 01:25:45 -- scripts/common.sh@338 -- $ local 'op=<' 00:10:00.374 01:25:45 -- scripts/common.sh@340 -- $ ver1_l=2 00:10:00.374 01:25:45 -- scripts/common.sh@341 -- $ ver2_l=1 00:10:00.374 01:25:45 -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:10:00.374 01:25:45 -- scripts/common.sh@344 -- $ case "$op" in 00:10:00.374 01:25:45 -- scripts/common.sh@345 -- $ : 1 00:10:00.374 01:25:45 -- scripts/common.sh@364 -- $ (( v = 0 )) 00:10:00.374 01:25:45 -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:10:00.374 01:25:45 -- scripts/common.sh@365 -- $ decimal 1 00:10:00.374 01:25:45 -- scripts/common.sh@353 -- $ local d=1 00:10:00.374 01:25:45 -- scripts/common.sh@354 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:10:00.374 01:25:45 -- scripts/common.sh@355 -- $ echo 1 00:10:00.374 01:25:45 -- scripts/common.sh@365 -- $ ver1[v]=1 00:10:00.374 01:25:45 -- scripts/common.sh@366 -- $ decimal 2 00:10:00.374 01:25:45 -- scripts/common.sh@353 -- $ local d=2 00:10:00.374 01:25:45 -- scripts/common.sh@354 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:10:00.374 01:25:45 -- scripts/common.sh@355 -- $ echo 2 00:10:00.374 01:25:45 -- scripts/common.sh@366 -- $ ver2[v]=2 00:10:00.374 01:25:45 -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:10:00.374 01:25:45 -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:10:00.374 01:25:45 -- scripts/common.sh@368 -- $ return 0 00:10:00.374 01:25:45 -- common/autotest_common.sh@1682 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:10:00.374 01:25:45 -- common/autotest_common.sh@1694 -- $ export 'LCOV_OPTS= 00:10:00.374 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:00.374 --rc genhtml_branch_coverage=1 00:10:00.374 --rc genhtml_function_coverage=1 00:10:00.374 --rc genhtml_legend=1 00:10:00.375 --rc geninfo_all_blocks=1 00:10:00.375 --rc geninfo_unexecuted_blocks=1 00:10:00.375 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:00.375 ' 00:10:00.375 01:25:45 -- common/autotest_common.sh@1694 -- $ LCOV_OPTS=' 00:10:00.375 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:00.375 --rc genhtml_branch_coverage=1 00:10:00.375 --rc genhtml_function_coverage=1 00:10:00.375 --rc genhtml_legend=1 00:10:00.375 --rc geninfo_all_blocks=1 00:10:00.375 --rc geninfo_unexecuted_blocks=1 00:10:00.375 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:00.375 ' 00:10:00.375 01:25:45 -- common/autotest_common.sh@1695 -- $ export 'LCOV=lcov 00:10:00.375 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:00.375 --rc genhtml_branch_coverage=1 00:10:00.375 --rc genhtml_function_coverage=1 00:10:00.375 --rc genhtml_legend=1 00:10:00.375 --rc geninfo_all_blocks=1 00:10:00.375 --rc geninfo_unexecuted_blocks=1 00:10:00.375 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:00.375 ' 00:10:00.375 01:25:45 -- common/autotest_common.sh@1695 -- $ LCOV='lcov 00:10:00.375 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:00.375 --rc genhtml_branch_coverage=1 00:10:00.375 --rc genhtml_function_coverage=1 00:10:00.375 --rc genhtml_legend=1 00:10:00.375 --rc geninfo_all_blocks=1 00:10:00.375 --rc geninfo_unexecuted_blocks=1 00:10:00.375 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:10:00.375 ' 00:10:00.375 01:25:45 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:10:00.375 01:25:45 -- scripts/common.sh@15 -- $ shopt -s extglob 00:10:00.375 01:25:45 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:10:00.375 01:25:45 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:10:00.375 01:25:45 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:10:00.375 01:25:45 -- paths/export.sh@2 -- $ 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:00.375 01:25:45 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:00.375 01:25:45 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:00.375 01:25:45 -- paths/export.sh@5 -- $ export PATH 00:10:00.375 01:25:45 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:00.375 01:25:45 -- common/autobuild_common.sh@478 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:10:00.375 01:25:45 -- common/autobuild_common.sh@479 -- $ date +%s 00:10:00.375 01:25:45 -- common/autobuild_common.sh@479 -- $ mktemp -dt spdk_1734395145.XXXXXX 00:10:00.375 01:25:45 -- common/autobuild_common.sh@479 -- $ SPDK_WORKSPACE=/tmp/spdk_1734395145.mJFh8E 00:10:00.375 01:25:45 -- common/autobuild_common.sh@481 -- $ [[ -n '' ]] 00:10:00.375 01:25:45 -- common/autobuild_common.sh@485 -- $ '[' -n v22.11.4 ']' 00:10:00.375 01:25:45 -- common/autobuild_common.sh@486 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:10:00.375 01:25:45 -- common/autobuild_common.sh@486 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:10:00.375 01:25:45 -- common/autobuild_common.sh@492 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:10:00.375 01:25:45 -- common/autobuild_common.sh@494 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:10:00.375 01:25:45 -- common/autobuild_common.sh@495 -- $ get_config_params 00:10:00.375 01:25:45 -- common/autotest_common.sh@407 -- $ xtrace_disable 00:10:00.375 01:25:45 -- common/autotest_common.sh@10 -- $ set +x 00:10:00.375 01:25:45 -- common/autobuild_common.sh@495 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
--with-vfio-user' 00:10:00.375 01:25:45 -- common/autobuild_common.sh@497 -- $ start_monitor_resources 00:10:00.375 01:25:45 -- pm/common@17 -- $ local monitor 00:10:00.375 01:25:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:00.375 01:25:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:00.375 01:25:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:00.375 01:25:45 -- pm/common@21 -- $ date +%s 00:10:00.375 01:25:45 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:00.375 01:25:45 -- pm/common@21 -- $ date +%s 00:10:00.375 01:25:45 -- pm/common@25 -- $ sleep 1 00:10:00.375 01:25:45 -- pm/common@21 -- $ date +%s 00:10:00.375 01:25:45 -- pm/common@21 -- $ date +%s 00:10:00.375 01:25:45 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1734395145 00:10:00.375 01:25:45 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1734395145 00:10:00.375 01:25:45 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1734395145 00:10:00.375 01:25:45 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1734395145 00:10:00.375 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1734395145_collect-vmstat.pm.log 00:10:00.375 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1734395145_collect-cpu-load.pm.log 00:10:00.375 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1734395145_collect-cpu-temp.pm.log 00:10:00.375 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1734395145_collect-bmc-pm.bmc.pm.log 00:10:00.943 01:25:46 -- common/autobuild_common.sh@498 -- $ trap stop_monitor_resources EXIT 00:10:00.943 01:25:46 -- spdk/autopackage.sh@10 -- $ [[ 0 -eq 1 ]] 00:10:00.943 01:25:46 -- spdk/autopackage.sh@14 -- $ timing_finish 00:10:00.943 01:25:46 -- common/autotest_common.sh@736 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:10:00.943 01:25:46 -- common/autotest_common.sh@737 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]] 00:10:00.943 01:25:46 -- common/autotest_common.sh@740 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:10:00.943 01:25:46 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:10:00.943 01:25:46 -- pm/common@29 -- $ signal_monitor_resources TERM 00:10:00.943 01:25:46 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:10:00.943 01:25:46 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:00.943 01:25:46 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:10:00.943 01:25:46 -- pm/common@44 -- $ pid=846274 00:10:00.943 01:25:46 -- pm/common@50 -- $ kill 
-TERM 846274 00:10:00.943 01:25:46 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:00.943 01:25:46 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:10:00.943 01:25:46 -- pm/common@44 -- $ pid=846276 00:10:00.943 01:25:46 -- pm/common@50 -- $ kill -TERM 846276 00:10:00.943 01:25:46 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:00.943 01:25:46 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:10:00.943 01:25:46 -- pm/common@44 -- $ pid=846278 00:10:00.943 01:25:46 -- pm/common@50 -- $ kill -TERM 846278 00:10:00.943 01:25:46 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:10:00.943 01:25:46 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:10:00.943 01:25:46 -- pm/common@44 -- $ pid=846300 00:10:00.943 01:25:46 -- pm/common@50 -- $ sudo -E kill -TERM 846300 00:10:01.202 + [[ -n 683434 ]] 00:10:01.202 + sudo kill 683434 00:10:01.212 [Pipeline] } 00:10:01.227 [Pipeline] // stage 00:10:01.232 [Pipeline] } 00:10:01.246 [Pipeline] // timeout 00:10:01.251 [Pipeline] } 00:10:01.265 [Pipeline] // catchError 00:10:01.270 [Pipeline] } 00:10:01.284 [Pipeline] // wrap 00:10:01.290 [Pipeline] } 00:10:01.302 [Pipeline] // catchError 00:10:01.312 [Pipeline] stage 00:10:01.314 [Pipeline] { (Epilogue) 00:10:01.328 [Pipeline] catchError 00:10:01.329 [Pipeline] { 00:10:01.342 [Pipeline] echo 00:10:01.343 Cleanup processes 00:10:01.349 [Pipeline] sh 00:10:01.638 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:01.638 846424 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/sdr.cache 00:10:01.638 846839 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:01.652 [Pipeline] sh 00:10:01.937 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:01.937 ++ grep -v 'sudo pgrep' 00:10:01.937 ++ awk '{print $1}' 00:10:01.937 + sudo kill -9 00:10:01.937 + true 00:10:01.950 [Pipeline] sh 00:10:02.235 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:10:02.235 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:02.235 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:03.612 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:13.601 [Pipeline] sh 00:10:13.887 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:10:13.887 Artifacts sizes are good 00:10:13.901 [Pipeline] archiveArtifacts 00:10:13.908 Archiving artifacts 00:10:14.077 [Pipeline] sh 00:10:14.484 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:10:14.499 [Pipeline] cleanWs 00:10:14.508 [WS-CLEANUP] Deleting project workspace... 00:10:14.508 [WS-CLEANUP] Deferred wipeout is used... 00:10:14.515 [WS-CLEANUP] done 00:10:14.517 [Pipeline] } 00:10:14.532 [Pipeline] // catchError 00:10:14.544 [Pipeline] sh 00:10:14.821 + logger -p user.info -t JENKINS-CI 00:10:14.830 [Pipeline] } 00:10:14.843 [Pipeline] // stage 00:10:14.848 [Pipeline] } 00:10:14.862 [Pipeline] // node 00:10:14.867 [Pipeline] End of Pipeline 00:10:14.907 Finished: SUCCESS
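
Editor's note: for readers following the vfio-user fuzz stages above, the per-instance setup that vfio/run.sh performs can be condensed into the short sketch below. It only restates commands visible in this log (instance 6: the mkdir, the sed rewrite of the template config, the LSAN leak suppressions, and the llvm_vfio_fuzz invocation); the redirection of the sed and echo output into the per-instance config and suppression files is an assumption inferred from the -c flag and LSAN_OPTIONS shown later, and the paths are specific to this CI workspace. It is an illustrative condensation, not an additional pipeline step.

  #!/usr/bin/env bash
  # Illustrative condensation of the vfio/run.sh steps logged above (fuzzer instance 6).
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  FUZZ_DIR=/tmp/vfio-user-6
  # Per-instance directories and corpus directory, as in run.sh@36.
  mkdir -p "$FUZZ_DIR"/domain/1 "$FUZZ_DIR"/domain/2 "$SPDK"/../corpus/llvm_vfio_6
  # Rewrite the template config so this instance talks to its own vfio-user sockets (run.sh@39).
  # Assumption: the result is written to the per-instance config later passed via -c.
  sed -e "s%/tmp/vfio-user/domain/1%$FUZZ_DIR/domain/1%; s%/tmp/vfio-user/domain/2%$FUZZ_DIR/domain/2%" \
      "$SPDK"/test/fuzz/llvm/vfio/fuzz_vfio_json.conf > "$FUZZ_DIR"/fuzz_vfio_json.conf
  # Known-benign leaks are suppressed for LeakSanitizer (run.sh@43/44; file path from LSAN_OPTIONS above).
  printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > /var/tmp/suppress_vfio_fuzz
  export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
  # Run fuzzer type 6 (-Z 6) for 1 second (-t 1) on core mask 0x1 against the per-instance sockets (run.sh@47).
  "$SPDK"/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 \
      -P "$SPDK"/../output/llvm/ -F "$FUZZ_DIR"/domain/1 -c "$FUZZ_DIR"/fuzz_vfio_json.conf \
      -t 1 -D "$SPDK"/../corpus/llvm_vfio_6 -Y "$FUZZ_DIR"/domain/2 -r "$FUZZ_DIR"/spdk6.sock -Z 6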